| id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
CyberHarem/ninomiya_asuka_idolmastercinderellagirls | 2023-09-17T17:35:09.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ninomiya_asuka (THE iDOLM@STER: Cinderella Girls)
This is the dataset of ninomiya_asuka (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 465 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 465 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 465 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 465 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
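To fetch one of these packs programmatically, a minimal sketch using `huggingface_hub` (assuming the zip filenames above sit at the root of this dataset repo, as the download links suggest):

```python
from zipfile import ZipFile

from huggingface_hub import hf_hub_download

# Download one of the packs listed in the table above
# (filename assumed to match the table, e.g. dataset-raw.zip).
path = hf_hub_download(
    repo_id="CyberHarem/ninomiya_asuka_idolmastercinderellagirls",
    filename="dataset-raw.zip",
    repo_type="dataset",
)

# Unpack the images and their meta information locally.
with ZipFile(path) as archive:
    archive.extractall("ninomiya_asuka_raw")
```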
|
opencompass/MMBench | 2023-09-13T02:11:05.000Z | [
"license:cc-by-4.0",
"region:us"
] | opencompass | null | null | null | 1 | 0 | ---
license: cc-by-4.0
---
|
open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3 | 2023-09-13T02:25:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of wei123602/llama-13b-FINETUNE3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wei123602/llama-13b-FINETUNE3](https://huggingface.co/wei123602/llama-13b-FINETUNE3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T02:24:38.254919](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3/blob/main/results_2023-09-13T02-24-38.254919.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.574816427427013,\n\
\ \"acc_stderr\": 0.034285561451492357,\n \"acc_norm\": 0.5790010156319961,\n\
\ \"acc_norm_stderr\": 0.03426570014568091,\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608767,\n \"mc2\": 0.41626548348007847,\n\
\ \"mc2_stderr\": 0.014474677192828984\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5520477815699659,\n \"acc_stderr\": 0.014532011498211676,\n\
\ \"acc_norm\": 0.5930034129692833,\n \"acc_norm_stderr\": 0.014356399418009121\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6093407687711612,\n\
\ \"acc_stderr\": 0.004869010152280755,\n \"acc_norm\": 0.8152758414658434,\n\
\ \"acc_norm_stderr\": 0.00387280518960755\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.030325945789286112,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.030325945789286112\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266344,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266344\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425082,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425082\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n\
\ \"acc_stderr\": 0.02704574657353433,\n \"acc_norm\": 0.6548387096774193,\n\
\ \"acc_norm_stderr\": 0.02704574657353433\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533085,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533085\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.02749350424454805,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.02749350424454805\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.024864995159767755,\n\
\ \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.024864995159767755\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515001,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515001\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7871559633027523,\n \"acc_stderr\": 0.017549376389313694,\n \"\
acc_norm\": 0.7871559633027523,\n \"acc_norm_stderr\": 0.017549376389313694\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291517,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291517\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326467,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326467\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.026853450377009157,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.026853450377009157\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.768837803320562,\n\
\ \"acc_stderr\": 0.015075523238101083,\n \"acc_norm\": 0.768837803320562,\n\
\ \"acc_norm_stderr\": 0.015075523238101083\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.025862201852277895,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.025862201852277895\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3474860335195531,\n\
\ \"acc_stderr\": 0.01592556406020815,\n \"acc_norm\": 0.3474860335195531,\n\
\ \"acc_norm_stderr\": 0.01592556406020815\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200858,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200858\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824089,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824089\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4485006518904824,\n\
\ \"acc_stderr\": 0.0127023174905598,\n \"acc_norm\": 0.4485006518904824,\n\
\ \"acc_norm_stderr\": 0.0127023174905598\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.030161911930767105,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.030161911930767105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5947712418300654,\n \"acc_stderr\": 0.019861155193829156,\n \
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.019861155193829156\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5795918367346938,\n \"acc_stderr\": 0.03160106993449601,\n\
\ \"acc_norm\": 0.5795918367346938,\n \"acc_norm_stderr\": 0.03160106993449601\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.031157150869355586,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.031157150869355586\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608767,\n \"mc2\": 0.41626548348007847,\n\
\ \"mc2_stderr\": 0.014474677192828984\n }\n}\n```"
repo_url: https://huggingface.co/wei123602/llama-13b-FINETUNE3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|arc:challenge|25_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hellaswag|10_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T02-24-38.254919.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-24-38.254919.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T02-24-38.254919.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T02-24-38.254919.parquet'
- config_name: results
data_files:
- split: 2023_09_13T02_24_38.254919
path:
- results_2023-09-13T02-24-38.254919.parquet
- split: latest
path:
- results_2023-09-13T02-24-38.254919.parquet
---
# Dataset Card for Evaluation run of wei123602/llama-13b-FINETUNE3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wei123602/llama-13b-FINETUNE3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wei123602/llama-13b-FINETUNE3](https://huggingface.co/wei123602/llama-13b-FINETUNE3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3",
"harness_truthfulqa_mc_0",
split="train")
```
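Each run is stored as a timestamp-named split (with `latest` pointing at the most recent one), so a specific run can be targeted directly; a sketch, using the split names declared in the `configs` section of this card:

```python
from datasets import load_dataset

# Load the results of one specific run via its timestamped split
# (split names are listed under `configs` in this card's metadata).
run = load_dataset(
    "open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3",
    "harness_truthfulqa_mc_0",
    split="2023_09_13T02_24_38.254919",
)

# "latest" always resolves to the most recent run for that task.
latest = load_dataset(
    "open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3",
    "harness_truthfulqa_mc_0",
    split="latest",
)
```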
## Latest results
These are the [latest results from run 2023-09-13T02:24:38.254919](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3/blob/main/results_2023-09-13T02-24-38.254919.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.574816427427013,
"acc_stderr": 0.034285561451492357,
"acc_norm": 0.5790010156319961,
"acc_norm_stderr": 0.03426570014568091,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608767,
"mc2": 0.41626548348007847,
"mc2_stderr": 0.014474677192828984
},
"harness|arc:challenge|25": {
"acc": 0.5520477815699659,
"acc_stderr": 0.014532011498211676,
"acc_norm": 0.5930034129692833,
"acc_norm_stderr": 0.014356399418009121
},
"harness|hellaswag|10": {
"acc": 0.6093407687711612,
"acc_stderr": 0.004869010152280755,
"acc_norm": 0.8152758414658434,
"acc_norm_stderr": 0.00387280518960755
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.030325945789286112,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.030325945789286112
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266344,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266344
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425082,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425082
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.02704574657353433,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.02704574657353433
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533085,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533085
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.02749350424454805,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.02749350424454805
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5974358974358974,
"acc_stderr": 0.024864995159767755,
"acc_norm": 0.5974358974358974,
"acc_norm_stderr": 0.024864995159767755
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515001,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515001
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7871559633027523,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.7871559633027523,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291517,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291517
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326467,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326467
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009157,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009157
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.768837803320562,
"acc_stderr": 0.015075523238101083,
"acc_norm": 0.768837803320562,
"acc_norm_stderr": 0.015075523238101083
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.025862201852277895,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.025862201852277895
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3474860335195531,
"acc_stderr": 0.01592556406020815,
"acc_norm": 0.3474860335195531,
"acc_norm_stderr": 0.01592556406020815
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200858,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200858
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824089,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824089
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4485006518904824,
"acc_stderr": 0.0127023174905598,
"acc_norm": 0.4485006518904824,
"acc_norm_stderr": 0.0127023174905598
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.030161911930767105,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.030161911930767105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.019861155193829156,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.019861155193829156
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5795918367346938,
"acc_stderr": 0.03160106993449601,
"acc_norm": 0.5795918367346938,
"acc_norm_stderr": 0.03160106993449601
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355586,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355586
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608767,
"mc2": 0.41626548348007847,
"mc2_stderr": 0.014474677192828984
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Minami-su/Complex_Evol_Network_Instruct_v0.1 | 2023-09-13T02:45:27.000Z | [
"language:zh",
"evol",
"online",
"complex",
"region:us"
] | Minami-su | null | null | null | 0 | 0 | ---
language:
- zh
tags:
- evol
- online
- complex
---
## Introduction
Based on self-instruct and evol-instruct, supplemented by data generated through online learning, the instructions range from simple to complex; the analysis in the `input` field is the result of the online-learning analysis.
## Challenges:
1. Instructions may not be entirely accurate, but they can be iterated upon continuously.
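To make the simple-to-complex evolution concrete, below is a minimal, hypothetical sketch of one evol-instruct style complication loop. The `EVOLVE_TEMPLATE`, the `llm` callable, and the `evolve` helper are illustrative assumptions, not this dataset's actual generation code.
```python
# Toy evol-instruct style loop: each round asks a model to rewrite the
# instruction into a harder variant. `llm` is any callable that maps a
# prompt string to a completion string (e.g. a wrapper around a chat API).
EVOLVE_TEMPLATE = (
    "Rewrite the following instruction into a more complex version that "
    "requires deeper reasoning, while keeping it answerable:\n\n{instruction}"
)

def evolve(instruction: str, llm, rounds: int = 3) -> list[str]:
    """Return the chain of instructions, from the simple seed to the most
    complex rewrite produced after `rounds` evolution steps."""
    chain = [instruction]
    for _ in range(rounds):
        chain.append(llm(EVOLVE_TEMPLATE.format(instruction=chain[-1])))
    return chain

# Offline stand-in "model" so the sketch runs without any API:
fake_llm = lambda prompt: prompt.splitlines()[-1] + " Explain and justify every step."
print(evolve("List three sorting algorithms.", fake_llm, rounds=2))
```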
## Citation
```
@misc{selfinstruct,
title={Self-Instruct: Aligning Language Model with Self Generated Instructions},
author={Wang, Yizhong and Kordi, Yeganeh and Mishra, Swaroop and Liu, Alisa and Smith, Noah A. and Khashabi, Daniel and Hajishirzi, Hannaneh},
journal={arXiv preprint arXiv:2212.10560},
year={2022}
}
```
```
@article{xu2023wizardlm,
title={Wizardlm: Empowering large language models to follow complex instructions},
author={Xu, Can and Sun, Qingfeng and Zheng, Kai and Geng, Xiubo and Zhao, Pu and Feng, Jiazhan and Tao, Chongyang and Jiang, Daxin},
journal={arXiv preprint arXiv:2304.12244},
year={2023}
}
``` |
CyberHarem/plumeri_pokemon | 2023-09-17T17:35:11.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of plumeri (Pokémon)
This is the dataset of plumeri (Pokémon), containing 137 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 137 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 358 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 137 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 137 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 137 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 137 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 137 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 358 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 358 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 358 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
NexaAI/Dress | 2023-09-13T03:00:45.000Z | [
"region:us"
] | NexaAI | null | null | null | 0 | 0 | Entry not found |
adkynhi/4nge1XL | 2023-09-22T09:46:39.000Z | [
"region:us"
] | adkynhi | null | null | null | 0 | 0 | Entry not found |
ArkaAcharya/Phi1 | 2023-09-13T03:04:45.000Z | [
"region:us"
] | ArkaAcharya | null | null | null | 0 | 0 | Entry not found |
Deepexi/function-calling-small | 2023-09-13T12:03:16.000Z | [
"task_categories:feature-extraction",
"size_categories:10K<n<100K",
"language:zh",
"license:cc-by-4.0",
"code",
"region:us"
] | Deepexi | null | null | null | 4 | 0 | ---
license: cc-by-4.0
language:
- zh
size_categories:
- 10K<n<100K
task_categories:
- feature-extraction
tags:
- code
---
## Dataset Contents
Information on 700+ Alibaba Cloud OpenAPIs, covering public OpenAPI details for products such as DataWorks, EMR, DataLake, MaxCompute, Hologram, Realtime Compute for Apache Flink, Quick BI, and DTS.
## Example
```
{
"systemPrompt": 你是一个函数筛选助理,如果与问题相关的话,您可以使用下面的函数来获取更多数据以回答用户提出的问题:{"function": "UpdateTicketNum", "description": "对用于免登嵌入报表的指定的ticket进行更新票据数量操作。", "arguments": [{"name": "Ticket", "type": "string", "description": "三方嵌入的票据值,即URL中的accessTicket值。"}, {"name": "TicketNum", "type": "integer", "description": "票据数。\n- 取值范围:1~99998,建议值为1。"}]}{"function": "DeregisterLocation", "description": "取消Location注册。", "arguments":[{"name": "LocationId", "type": "string", "description": "Location ID\n> 您可以调用接口RegisterLocation获取Location ID。"}]}{"function": "SyncMemberBehaviorInfo", "description": "保存会员行为信息。", "arguments": [{"name": "body", "type": "object", "description": "请求参数"}]}请以如下格式回复::{"function":"function_name","arguments": {"argument1": value1,"argument2": value2}},
"userPrompt": "我想将免登嵌入报表的票据值为"abcd1234"的票据数量更新为10。",
"assistantResponse":
{
"function": "UpdateTicketNum",
"arguments": [
{
"Ticket": "abcd1234",
"TicketNum": 10
}
]
}
}
```
### Fields
```
systemPrompt: the instruction (system prompt)
userPrompt: the user input
assistantResponse: the output
```
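For a quick start, here is a minimal usage sketch with the Hugging Face `datasets` library; the `train` split name and the chat-style template below are assumptions for illustration, not a format mandated by the dataset.
```python
from datasets import load_dataset

# Load the dataset (assuming a default "train" split).
ds = load_dataset("Deepexi/function-calling-small", split="train")

def to_prompt(example: dict) -> str:
    # Assemble one record into a single training string; the tags are an
    # illustrative template, not part of the dataset itself.
    return (
        f"<system>{example['systemPrompt']}</system>\n"
        f"<user>{example['userPrompt']}</user>\n"
        f"<assistant>{example['assistantResponse']}</assistant>"
    )

print(to_prompt(ds[0]))
```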
## Dataset Uses
- Function-call understanding: by analyzing the function-call information in the dialogues, a language model can better learn the relationships between functions, improving its code-comprehension ability.
- Alibaba Cloud OpenAPI: with the Alibaba Cloud OpenAPI information in the data, a model can better understand the relevant APIs and how to invoke them, offering more suitable function suggestions during development.
If you have any questions or need further assistance, please feel free to contact us. Thank you for your interest in and support of the function-calling dataset and its applications! |
open-llm-leaderboard/details__fsx_shared-falcon-180B_platypus_15_converted_safetensors | 2023-09-13T03:07:40.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of _fsx_shared-falcon-180B_platypus_15_converted_safetensors
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [_fsx_shared-falcon-180B_platypus_15_converted_safetensors](https://huggingface.co/_fsx_shared-falcon-180B_platypus_15_converted_safetensors)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details__fsx_shared-falcon-180B_platypus_15_converted_safetensors\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T03:07:15.932697](https://huggingface.co/datasets/open-llm-leaderboard/details__fsx_shared-falcon-180B_platypus_15_converted_safetensors/blob/main/results_2023-09-13T03-07-15.932697.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6795378405588016,\n\
\ \"acc_stderr\": 0.03169754857202292,\n \"acc_norm\": 0.6832295460165766,\n\
\ \"acc_norm_stderr\": 0.031667751099416844,\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.01711581563241818,\n \"mc2\": 0.5565099709811991,\n\
\ \"mc2_stderr\": 0.015263307246122862\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6100682593856656,\n \"acc_stderr\": 0.01425295984889289,\n\
\ \"acc_norm\": 0.6569965870307167,\n \"acc_norm_stderr\": 0.013872423223718166\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7210714997012547,\n\
\ \"acc_stderr\": 0.004475557360359705,\n \"acc_norm\": 0.8919537940649273,\n\
\ \"acc_norm_stderr\": 0.003098043101775829\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n\
\ \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.7986111111111112,\n\
\ \"acc_norm_stderr\": 0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.031639106653672915,\n\
\ \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.031639106653672915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4708994708994709,\n \"acc_stderr\": 0.025707658614154957,\n \"\
acc_norm\": 0.4708994708994709,\n \"acc_norm_stderr\": 0.025707658614154957\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.02233170761182307,\n\
\ \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.02233170761182307\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n \"\
acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.03123475237772117,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.03123475237772117\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.026552207828215293,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.026552207828215293\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n\
\ \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 0.01599622932024412\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970562,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970562\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"\
acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8623853211009175,\n \"acc_stderr\": 0.014770105878649416,\n \"\
acc_norm\": 0.8623853211009175,\n \"acc_norm_stderr\": 0.014770105878649416\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.03362277436608044,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03362277436608044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931048,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931048\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017016,\n \
\ \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017016\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.027790177064383602,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.027790177064383602\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.0345727283691767,\n \"acc_norm\"\
: 0.8264462809917356,\n \"acc_norm_stderr\": 0.0345727283691767\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5803571428571429,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.5803571428571429,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8492975734355045,\n\
\ \"acc_stderr\": 0.012793420883120802,\n \"acc_norm\": 0.8492975734355045,\n\
\ \"acc_norm_stderr\": 0.012793420883120802\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5921787709497207,\n\
\ \"acc_stderr\": 0.016435865260914746,\n \"acc_norm\": 0.5921787709497207,\n\
\ \"acc_norm_stderr\": 0.016435865260914746\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8117283950617284,\n \"acc_stderr\": 0.021751866060815864,\n\
\ \"acc_norm\": 0.8117283950617284,\n \"acc_norm_stderr\": 0.021751866060815864\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291467,\n \
\ \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291467\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5247718383311604,\n\
\ \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.5247718383311604,\n\
\ \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7450980392156863,\n \"acc_stderr\": 0.017630827375148383,\n \
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.017630827375148383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n\
\ \"acc_stderr\": 0.04122066502878285,\n \"acc_norm\": 0.7545454545454545,\n\
\ \"acc_norm_stderr\": 0.04122066502878285\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.02448448716291397,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.02448448716291397\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.01711581563241818,\n \"mc2\": 0.5565099709811991,\n\
\ \"mc2_stderr\": 0.015263307246122862\n }\n}\n```"
repo_url: https://huggingface.co/_fsx_shared-falcon-180B_platypus_15_converted_safetensors
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|arc:challenge|25_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hellaswag|10_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T03-07-15.932697.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T03-07-15.932697.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T03-07-15.932697.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T03-07-15.932697.parquet'
- config_name: results
data_files:
- split: 2023_09_13T03_07_15.932697
path:
- results_2023-09-13T03-07-15.932697.parquet
- split: latest
path:
- results_2023-09-13T03-07-15.932697.parquet
---
# Dataset Card for Evaluation run of _fsx_shared-falcon-180B_platypus_15_converted_safetensors
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/_fsx_shared-falcon-180B_platypus_15_converted_safetensors
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [_fsx_shared-falcon-180B_platypus_15_converted_safetensors](https://huggingface.co/_fsx_shared-falcon-180B_platypus_15_converted_safetensors) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details__fsx_shared-falcon-180B_platypus_15_converted_safetensors",
"harness_truthfulqa_mc_0",
split="train")
```
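Similarly, the aggregated scores can be loaded from the `results` configuration listed above; its `latest` split always points at the most recent run:
```python
from datasets import load_dataset
results = load_dataset("open-llm-leaderboard/details__fsx_shared-falcon-180B_platypus_15_converted_safetensors",
	"results",
	split="latest")
```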
## Latest results
These are the [latest results from run 2023-09-13T03:07:15.932697](https://huggingface.co/datasets/open-llm-leaderboard/details__fsx_shared-falcon-180B_platypus_15_converted_safetensors/blob/main/results_2023-09-13T03-07-15.932697.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6795378405588016,
"acc_stderr": 0.03169754857202292,
"acc_norm": 0.6832295460165766,
"acc_norm_stderr": 0.031667751099416844,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.01711581563241818,
"mc2": 0.5565099709811991,
"mc2_stderr": 0.015263307246122862
},
"harness|arc:challenge|25": {
"acc": 0.6100682593856656,
"acc_stderr": 0.01425295984889289,
"acc_norm": 0.6569965870307167,
"acc_norm_stderr": 0.013872423223718166
},
"harness|hellaswag|10": {
"acc": 0.7210714997012547,
"acc_stderr": 0.004475557360359705,
"acc_norm": 0.8919537940649273,
"acc_norm_stderr": 0.003098043101775829
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7986111111111112,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.7986111111111112,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.031639106653672915,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.031639106653672915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4708994708994709,
"acc_stderr": 0.025707658614154957,
"acc_norm": 0.4708994708994709,
"acc_norm_stderr": 0.025707658614154957
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.02233170761182307,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.02233170761182307
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.03123475237772117,
"acc_norm": 0.8,
"acc_norm_stderr": 0.03123475237772117
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026552207828215293,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026552207828215293
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970562,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970562
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7521008403361344,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.7521008403361344,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8623853211009175,
"acc_stderr": 0.014770105878649416,
"acc_norm": 0.8623853211009175,
"acc_norm_stderr": 0.014770105878649416
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931048,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931048
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017016,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017016
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383602,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383602
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.0345727283691767,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.0345727283691767
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5803571428571429,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.5803571428571429,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8492975734355045,
"acc_stderr": 0.012793420883120802,
"acc_norm": 0.8492975734355045,
"acc_norm_stderr": 0.012793420883120802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5921787709497207,
"acc_stderr": 0.016435865260914746,
"acc_norm": 0.5921787709497207,
"acc_norm_stderr": 0.016435865260914746
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8117283950617284,
"acc_stderr": 0.021751866060815864,
"acc_norm": 0.8117283950617284,
"acc_norm_stderr": 0.021751866060815864
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5460992907801419,
"acc_stderr": 0.029700453247291467,
"acc_norm": 0.5460992907801419,
"acc_norm_stderr": 0.029700453247291467
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5247718383311604,
"acc_stderr": 0.012754553719781753,
"acc_norm": 0.5247718383311604,
"acc_norm_stderr": 0.012754553719781753
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103128,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103128
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.04122066502878285,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.04122066502878285
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.02448448716291397,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.02448448716291397
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.01711581563241818,
"mc2": 0.5565099709811991,
"mc2_stderr": 0.015263307246122862
}
}
```
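The aggregated metrics shown above can also be loaded directly; a minimal sketch, assuming the `results` configuration and its `latest` split declared in this repository's metadata:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; the "latest" split of the
# "results" configuration points to the newest results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details__fsx_shared-falcon-180B_platypus_15_converted_safetensors",
    "results",
    split="latest",
)
```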
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/hapu_pokemon | 2023-09-17T17:35:13.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hapu (Pokémon)
This is the dataset of hapu (Pokémon), containing 68 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 68 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 177 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 68 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 68 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 68 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 68 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 68 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 177 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 177 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 177 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
NexaAI/Skrit | 2023-09-13T03:30:57.000Z | [
"region:us"
] | NexaAI | null | null | null | 0 | 0 | Entry not found |
namngo/qg-vico-m | 2023-09-13T03:34:17.000Z | [
"region:us"
] | namngo | null | null | null | 0 | 0 | Entry not found |
CyberHarem/momi_pokemon | 2023-09-17T17:35:15.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of momi (Pokémon)
This is the dataset of momi (Pokémon), containing 37 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 37 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 90 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 37 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 37 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 37 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 37 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 37 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 90 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 90 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 90 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
chunpingvi/clean_poems | 2023-09-28T09:29:57.000Z | [
"region:us"
] | chunpingvi | null | null | null | 0 | 0 | Entry not found |
CyberHarem/mars_pokemon | 2023-09-17T17:35:17.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mars (Pokémon)
This is the dataset of mars (Pokémon), containing 29 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 29 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 71 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 29 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 29 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 29 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 29 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 29 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 71 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 71 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 71 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_Danielbrdz__Barcenas-13b | 2023-09-13T03:49:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Danielbrdz/Barcenas-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Danielbrdz/Barcenas-13b](https://huggingface.co/Danielbrdz/Barcenas-13b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Danielbrdz__Barcenas-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T03:48:16.128379](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__Barcenas-13b/blob/main/results_2023-09-13T03-48-16.128379.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.563557970115505,\n\
\ \"acc_stderr\": 0.03446911909309902,\n \"acc_norm\": 0.567766187889856,\n\
\ \"acc_norm_stderr\": 0.03444777042202396,\n \"mc1\": 0.32313341493268055,\n\
\ \"mc1_stderr\": 0.016371836286454604,\n \"mc2\": 0.46670046729201153,\n\
\ \"mc2_stderr\": 0.015127936637997658\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5716723549488054,\n \"acc_stderr\": 0.014460496367599017,\n\
\ \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.01423587248790987\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6139215295757817,\n\
\ \"acc_stderr\": 0.00485853952787246,\n \"acc_norm\": 0.8212507468631747,\n\
\ \"acc_norm_stderr\": 0.003823591814133031\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286627,\n\
\ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286627\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.024278568024307706,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.024278568024307706\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\
\ \"acc_stderr\": 0.026923446059302837,\n \"acc_norm\": 0.6612903225806451,\n\
\ \"acc_norm_stderr\": 0.026923446059302837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.0364620496325381,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.0364620496325381\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124498,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124498\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817234,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850154,\n\
\ \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850154\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n\
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7376146788990826,\n \"acc_stderr\": 0.01886188502153473,\n \"\
acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.01886188502153473\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608043,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.02977177522814563,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02977177522814563\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460305,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460305\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.041733491480834994,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.041733491480834994\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922726,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922726\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n\
\ \"acc_stderr\": 0.015411308769686936,\n \"acc_norm\": 0.7535121328224776,\n\
\ \"acc_norm_stderr\": 0.015411308769686936\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.026362437574546545,\n\
\ \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.026362437574546545\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n\
\ \"acc_stderr\": 0.016536829648997102,\n \"acc_norm\": 0.42569832402234636,\n\
\ \"acc_norm_stderr\": 0.016536829648997102\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.02787074527829028,\n\
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.02787074527829028\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.02741799670563099,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.02741799670563099\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.026289734945952926,\n\
\ \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.026289734945952926\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255855,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255855\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.409387222946545,\n\
\ \"acc_stderr\": 0.012558780895570752,\n \"acc_norm\": 0.409387222946545,\n\
\ \"acc_norm_stderr\": 0.012558780895570752\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.030332578094555026,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.030332578094555026\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5800653594771242,\n \"acc_stderr\": 0.01996681117825648,\n \
\ \"acc_norm\": 0.5800653594771242,\n \"acc_norm_stderr\": 0.01996681117825648\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357303,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357303\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117827,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117827\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32313341493268055,\n\
\ \"mc1_stderr\": 0.016371836286454604,\n \"mc2\": 0.46670046729201153,\n\
\ \"mc2_stderr\": 0.015127936637997658\n }\n}\n```"
repo_url: https://huggingface.co/Danielbrdz/Barcenas-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|arc:challenge|25_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hellaswag|10_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T03-48-16.128379.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T03-48-16.128379.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T03-48-16.128379.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T03-48-16.128379.parquet'
- config_name: results
data_files:
- split: 2023_09_13T03_48_16.128379
path:
- results_2023-09-13T03-48-16.128379.parquet
- split: latest
path:
- results_2023-09-13T03-48-16.128379.parquet
---
# Dataset Card for Evaluation run of Danielbrdz/Barcenas-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Danielbrdz/Barcenas-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Danielbrdz/Barcenas-13b](https://huggingface.co/Danielbrdz/Barcenas-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Danielbrdz__Barcenas-13b",
"harness_truthfulqa_mc_0",
split="train")
```
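
To pull the aggregated scores instead of the per-sample details, the same call works with the `results` config defined above — a minimal sketch, assuming the split names listed in the YAML header:

```python
from datasets import load_dataset

# Aggregated metrics for this model; "latest" always points to the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_Danielbrdz__Barcenas-13b",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the run's aggregated metrics
```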
## Latest results
These are the [latest results from run 2023-09-13T03:48:16.128379](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__Barcenas-13b/blob/main/results_2023-09-13T03-48-16.128379.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.563557970115505,
"acc_stderr": 0.03446911909309902,
"acc_norm": 0.567766187889856,
"acc_norm_stderr": 0.03444777042202396,
"mc1": 0.32313341493268055,
"mc1_stderr": 0.016371836286454604,
"mc2": 0.46670046729201153,
"mc2_stderr": 0.015127936637997658
},
"harness|arc:challenge|25": {
"acc": 0.5716723549488054,
"acc_stderr": 0.014460496367599017,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.01423587248790987
},
"harness|hellaswag|10": {
"acc": 0.6139215295757817,
"acc_stderr": 0.00485853952787246,
"acc_norm": 0.8212507468631747,
"acc_norm_stderr": 0.003823591814133031
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.029890609686286627,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.029890609686286627
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.024278568024307706,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.024278568024307706
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574925,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574925
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.026923446059302837,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.026923446059302837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.0364620496325381,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.0364620496325381
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124498,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124498
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5307692307692308,
"acc_stderr": 0.025302958890850154,
"acc_norm": 0.5307692307692308,
"acc_norm_stderr": 0.025302958890850154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.01886188502153473,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.01886188502153473
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460305,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460305
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.041733491480834994,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.041733491480834994
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922726,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922726
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7535121328224776,
"acc_stderr": 0.015411308769686936,
"acc_norm": 0.7535121328224776,
"acc_norm_stderr": 0.015411308769686936
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.026362437574546545,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.026362437574546545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.016536829648997102,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.016536829648997102
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.02787074527829028,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.02787074527829028
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.02741799670563099,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.02741799670563099
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.026289734945952926,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.026289734945952926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255855,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.409387222946545,
"acc_stderr": 0.012558780895570752,
"acc_norm": 0.409387222946545,
"acc_norm_stderr": 0.012558780895570752
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.030332578094555026,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.030332578094555026
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5800653594771242,
"acc_stderr": 0.01996681117825648,
"acc_norm": 0.5800653594771242,
"acc_norm_stderr": 0.01996681117825648
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357303,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357303
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117827,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117827
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32313341493268055,
"mc1_stderr": 0.016371836286454604,
"mc2": 0.46670046729201153,
"mc2_stderr": 0.015127936637997658
}
}
```
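
For a quick view of where the model is strongest, the dictionary above can be sorted by subtask score — a minimal sketch, assuming it has been parsed into a variable named `results` (e.g. from the linked JSON file):

```python
# `results` is assumed to hold the dictionary printed above.
mmlu = {
    task: scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest")
}
# Print the five best-scoring MMLU subtasks.
for task, acc_norm in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {acc_norm:.3f}")
```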
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/dracaena_pokemon | 2023-09-17T17:35:20.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of dracaena (Pokémon)
This is the dataset of dracaena (Pokémon), containing 31 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 31 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 81 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 31 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 31 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 31 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 31 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 31 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 81 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 81 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 81 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
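
The zip packs listed above can also be fetched programmatically — a minimal sketch using `huggingface_hub`, assuming this card's repo id:

```python
from huggingface_hub import hf_hub_download

# Download one pack from this dataset repo; any filename from the table works.
path = hf_hub_download(
    repo_id="CyberHarem/dracaena_pokemon",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
print(path)  # local path to the cached zip
```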
|
CyberHarem/dahlia_pokemon | 2023-09-17T17:35:22.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of dahlia (Pokémon)
This is the dataset of dahlia (Pokémon), containing 10 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 10 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 26 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 10 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 10 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 10 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 10 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 10 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 26 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 26 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 26 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat | 2023-09-13T04:22:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TigerResearch/tigerbot-70b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TigerResearch/tigerbot-70b-chat](https://huggingface.co/TigerResearch/tigerbot-70b-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T04:21:04.931146](https://huggingface.co/datasets/open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat/blob/main/results_2023-09-13T04-21-04.931146.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6623767920558778,\n\
\ \"acc_stderr\": 0.03231992292058272,\n \"acc_norm\": 0.6663197906680468,\n\
\ \"acc_norm_stderr\": 0.032285508016981414,\n \"mc1\": 0.40024479804161567,\n\
\ \"mc1_stderr\": 0.01715160555574914,\n \"mc2\": 0.5509801998519438,\n\
\ \"mc2_stderr\": 0.014938133810506677\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7286689419795221,\n \"acc_stderr\": 0.012993807727545782,\n\
\ \"acc_norm\": 0.7679180887372014,\n \"acc_norm_stderr\": 0.012336718284948854\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6849233220474009,\n\
\ \"acc_stderr\": 0.004635970060392415,\n \"acc_norm\": 0.8783110934076878,\n\
\ \"acc_norm_stderr\": 0.0032625801905118647\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118634,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118634\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.045796394220704334,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.045796394220704334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777025,\n \"\
acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777025\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.029311188674983127,\n\
\ \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.029311188674983127\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.024825909793343343,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.024825909793343343\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033467,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033467\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.02394672474156397,\n \
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.02394672474156397\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.029079374539480007,\n\
\ \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.029079374539480007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.04075224992216979,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.04075224992216979\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8715596330275229,\n \"acc_stderr\": 0.014344977542914318,\n \"\
acc_norm\": 0.8715596330275229,\n \"acc_norm_stderr\": 0.014344977542914318\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8921568627450981,\n \"acc_stderr\": 0.021770522281368394,\n \"\
acc_norm\": 0.8921568627450981,\n \"acc_norm_stderr\": 0.021770522281368394\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n \
\ \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7309417040358744,\n\
\ \"acc_stderr\": 0.029763779406874965,\n \"acc_norm\": 0.7309417040358744,\n\
\ \"acc_norm_stderr\": 0.029763779406874965\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45251396648044695,\n\
\ \"acc_stderr\": 0.016646914804438778,\n \"acc_norm\": 0.45251396648044695,\n\
\ \"acc_norm_stderr\": 0.016646914804438778\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135124,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135124\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5267275097783573,\n\
\ \"acc_stderr\": 0.012751977967675998,\n \"acc_norm\": 0.5267275097783573,\n\
\ \"acc_norm_stderr\": 0.012751977967675998\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462916,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462916\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.696078431372549,\n \"acc_stderr\": 0.01860755213127983,\n \
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.01860755213127983\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.027212835884073125,\n\
\ \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.027212835884073125\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40024479804161567,\n\
\ \"mc1_stderr\": 0.01715160555574914,\n \"mc2\": 0.5509801998519438,\n\
\ \"mc2_stderr\": 0.014938133810506677\n }\n}\n```"
repo_url: https://huggingface.co/TigerResearch/tigerbot-70b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|arc:challenge|25_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|arc:challenge|25_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hellaswag|10_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hellaswag|10_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T04-03-35.733983.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T04-21-04.931146.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-21-04.931146.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T04-03-35.733983.parquet'
- split: 2023_09_13T04_21_04.931146
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T04-21-04.931146.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T04-21-04.931146.parquet'
- config_name: results
data_files:
- split: 2023_09_13T04_03_35.733983
path:
- results_2023-09-13T04-03-35.733983.parquet
- split: 2023_09_13T04_21_04.931146
path:
- results_2023-09-13T04-21-04.931146.parquet
- split: latest
path:
- results_2023-09-13T04-21-04.931146.parquet
---
# Dataset Card for Evaluation run of TigerResearch/tigerbot-70b-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TigerResearch/tigerbot-70b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TigerResearch/tigerbot-70b-chat](https://huggingface.co/TigerResearch/tigerbot-70b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat",
"harness_truthfulqa_mc_0",
	split="latest")
```
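Each configuration also declares timestamped splits (see the `configs` section above), so a specific run can be requested directly instead of the latest one; a minimal sketch using one of the run timestamps defined for this dataset:
```python
from datasets import load_dataset

# Load a single evaluation run by its timestamped split name,
# exactly as declared in the `configs` section of this card.
run = load_dataset(
    "open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat",
    "harness_truthfulqa_mc_0",
    split="2023_09_13T04_21_04.931146",
)
```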
## Latest results
These are the [latest results from run 2023-09-13T04:21:04.931146](https://huggingface.co/datasets/open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat/blob/main/results_2023-09-13T04-21-04.931146.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6623767920558778,
"acc_stderr": 0.03231992292058272,
"acc_norm": 0.6663197906680468,
"acc_norm_stderr": 0.032285508016981414,
"mc1": 0.40024479804161567,
"mc1_stderr": 0.01715160555574914,
"mc2": 0.5509801998519438,
"mc2_stderr": 0.014938133810506677
},
"harness|arc:challenge|25": {
"acc": 0.7286689419795221,
"acc_stderr": 0.012993807727545782,
"acc_norm": 0.7679180887372014,
"acc_norm_stderr": 0.012336718284948854
},
"harness|hellaswag|10": {
"acc": 0.6849233220474009,
"acc_stderr": 0.004635970060392415,
"acc_norm": 0.8783110934076878,
"acc_norm_stderr": 0.0032625801905118647
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118634,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118634
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.045796394220704334,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.045796394220704334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.029311188674983127,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.029311188674983127
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.024825909793343343,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.024825909793343343
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033467,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.02394672474156397,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.02394672474156397
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7226890756302521,
"acc_stderr": 0.029079374539480007,
"acc_norm": 0.7226890756302521,
"acc_norm_stderr": 0.029079374539480007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.04075224992216979,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.04075224992216979
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8715596330275229,
"acc_stderr": 0.014344977542914318,
"acc_norm": 0.8715596330275229,
"acc_norm_stderr": 0.014344977542914318
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8921568627450981,
"acc_stderr": 0.021770522281368394,
"acc_norm": 0.8921568627450981,
"acc_norm_stderr": 0.021770522281368394
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8649789029535865,
"acc_stderr": 0.022245776632003694,
"acc_norm": 0.8649789029535865,
"acc_norm_stderr": 0.022245776632003694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7309417040358744,
"acc_stderr": 0.029763779406874965,
"acc_norm": 0.7309417040358744,
"acc_norm_stderr": 0.029763779406874965
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45251396648044695,
"acc_stderr": 0.016646914804438778,
"acc_norm": 0.45251396648044695,
"acc_norm_stderr": 0.016646914804438778
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135124,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135124
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5267275097783573,
"acc_stderr": 0.012751977967675998,
"acc_norm": 0.5267275097783573,
"acc_norm_stderr": 0.012751977967675998
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462916,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462916
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.01860755213127983,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.01860755213127983
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.027212835884073125,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.027212835884073125
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40024479804161567,
"mc1_stderr": 0.01715160555574914,
"mc2": 0.5509801998519438,
"mc2_stderr": 0.014938133810506677
}
}
```
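The aggregated numbers above can also be pulled programmatically through the `results` configuration declared in this card; a minimal sketch, assuming the same `latest` split convention as the per-task configurations:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the field layout mirrors the JSON shown above.
results = load_dataset(
    "open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat",
    "results",
    split="latest",
)
print(results[0])
```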
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_rameshm__llama-2-13b-mathgpt-v4 | 2023-09-13T04:15:15.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of rameshm/llama-2-13b-mathgpt-v4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rameshm/llama-2-13b-mathgpt-v4](https://huggingface.co/rameshm/llama-2-13b-mathgpt-v4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rameshm__llama-2-13b-mathgpt-v4\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T04:13:58.726542](https://huggingface.co/datasets/open-llm-leaderboard/details_rameshm__llama-2-13b-mathgpt-v4/blob/main/results_2023-09-13T04-13-58.726542.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.44053962757663656,\n\
\ \"acc_stderr\": 0.03530747521526717,\n \"acc_norm\": 0.444440355642725,\n\
\ \"acc_norm_stderr\": 0.03529741095304998,\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4195799647316413,\n\
\ \"mc2_stderr\": 0.015002986692523554\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.45819112627986347,\n \"acc_stderr\": 0.014560220308714695,\n\
\ \"acc_norm\": 0.5093856655290102,\n \"acc_norm_stderr\": 0.014608816322065003\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5766779525990838,\n\
\ \"acc_stderr\": 0.0049307573908973475,\n \"acc_norm\": 0.7556263692491536,\n\
\ \"acc_norm_stderr\": 0.004288369906733092\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.04026097083296558,\n\
\ \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.04026097083296558\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4641509433962264,\n \"acc_stderr\": 0.030693675018458006,\n\
\ \"acc_norm\": 0.4641509433962264,\n \"acc_norm_stderr\": 0.030693675018458006\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4046242774566474,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.4046242774566474,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.023695415009463087\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5064516129032258,\n\
\ \"acc_stderr\": 0.02844163823354051,\n \"acc_norm\": 0.5064516129032258,\n\
\ \"acc_norm_stderr\": 0.02844163823354051\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.44242424242424244,\n \"acc_stderr\": 0.038783721137112745,\n\
\ \"acc_norm\": 0.44242424242424244,\n \"acc_norm_stderr\": 0.038783721137112745\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5353535353535354,\n \"acc_stderr\": 0.035534363688280605,\n \"\
acc_norm\": 0.5353535353535354,\n \"acc_norm_stderr\": 0.035534363688280605\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6269430051813472,\n \"acc_stderr\": 0.03490205592048573,\n\
\ \"acc_norm\": 0.6269430051813472,\n \"acc_norm_stderr\": 0.03490205592048573\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.40512820512820513,\n \"acc_stderr\": 0.024890471769938145,\n\
\ \"acc_norm\": 0.40512820512820513,\n \"acc_norm_stderr\": 0.024890471769938145\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945266,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945266\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.44537815126050423,\n \"acc_stderr\": 0.0322841062671639,\n \
\ \"acc_norm\": 0.44537815126050423,\n \"acc_norm_stderr\": 0.0322841062671639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5743119266055046,\n \"acc_stderr\": 0.0211992359724708,\n \"acc_norm\"\
: 0.5743119266055046,\n \"acc_norm_stderr\": 0.0211992359724708\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4305555555555556,\n\
\ \"acc_stderr\": 0.03376922151252335,\n \"acc_norm\": 0.4305555555555556,\n\
\ \"acc_norm_stderr\": 0.03376922151252335\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.5245098039215687,\n \"acc_stderr\": 0.03505093194348798,\n\
\ \"acc_norm\": 0.5245098039215687,\n \"acc_norm_stderr\": 0.03505093194348798\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5611814345991561,\n \"acc_stderr\": 0.032302649315470375,\n \
\ \"acc_norm\": 0.5611814345991561,\n \"acc_norm_stderr\": 0.032302649315470375\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n\
\ \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.5246636771300448,\n\
\ \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4961832061068702,\n \"acc_stderr\": 0.04385162325601553,\n\
\ \"acc_norm\": 0.4961832061068702,\n \"acc_norm_stderr\": 0.04385162325601553\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4601226993865031,\n \"acc_stderr\": 0.039158572914369714,\n\
\ \"acc_norm\": 0.4601226993865031,\n \"acc_norm_stderr\": 0.039158572914369714\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5145631067961165,\n \"acc_stderr\": 0.049486373240266356,\n\
\ \"acc_norm\": 0.5145631067961165,\n \"acc_norm_stderr\": 0.049486373240266356\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6282051282051282,\n\
\ \"acc_stderr\": 0.031660988918880785,\n \"acc_norm\": 0.6282051282051282,\n\
\ \"acc_norm_stderr\": 0.031660988918880785\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5504469987228607,\n\
\ \"acc_stderr\": 0.017788725283507337,\n \"acc_norm\": 0.5504469987228607,\n\
\ \"acc_norm_stderr\": 0.017788725283507337\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.026842985519615375,\n\
\ \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.026842985519615375\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.028620130800700246,\n\
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.028620130800700246\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4919614147909968,\n\
\ \"acc_stderr\": 0.028394421370984538,\n \"acc_norm\": 0.4919614147909968,\n\
\ \"acc_norm_stderr\": 0.028394421370984538\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.027801656212323667,\n\
\ \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.027801656212323667\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3120567375886525,\n \"acc_stderr\": 0.027640120545169924,\n \
\ \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.027640120545169924\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3305084745762712,\n\
\ \"acc_stderr\": 0.012014142101842963,\n \"acc_norm\": 0.3305084745762712,\n\
\ \"acc_norm_stderr\": 0.012014142101842963\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.02989616303312547,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.02989616303312547\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3872549019607843,\n \"acc_stderr\": 0.01970687580408563,\n \
\ \"acc_norm\": 0.3872549019607843,\n \"acc_norm_stderr\": 0.01970687580408563\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794915,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794915\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5061224489795918,\n \"acc_stderr\": 0.032006820201639086,\n\
\ \"acc_norm\": 0.5061224489795918,\n \"acc_norm_stderr\": 0.032006820201639086\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5906432748538012,\n \"acc_stderr\": 0.03771283107626544,\n\
\ \"acc_norm\": 0.5906432748538012,\n \"acc_norm_stderr\": 0.03771283107626544\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4195799647316413,\n\
\ \"mc2_stderr\": 0.015002986692523554\n }\n}\n```"
repo_url: https://huggingface.co/rameshm/llama-2-13b-mathgpt-v4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|arc:challenge|25_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hellaswag|10_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T04-13-58.726542.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-13-58.726542.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T04-13-58.726542.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T04-13-58.726542.parquet'
- config_name: results
data_files:
- split: 2023_09_13T04_13_58.726542
path:
- results_2023-09-13T04-13-58.726542.parquet
- split: latest
path:
- results_2023-09-13T04-13-58.726542.parquet
---
# Dataset Card for Evaluation run of rameshm/llama-2-13b-mathgpt-v4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/rameshm/llama-2-13b-mathgpt-v4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [rameshm/llama-2-13b-mathgpt-v4](https://huggingface.co/rameshm/llama-2-13b-mathgpt-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rameshm__llama-2-13b-mathgpt-v4",
"harness_truthfulqa_mc_0",
	split="latest")
```
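If you want the aggregated scores rather than the per-sample details, you can instead load the `results` configuration pinned to the `latest` split (a minimal sketch; the config and split names come from the YAML header above):

```python
from datasets import load_dataset

# Aggregated metrics: one row per evaluation run, with the "latest" split
# always resolving to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_rameshm__llama-2-13b-mathgpt-v4",
    "results",
    split="latest",
)
print(results[0])
```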
## Latest results
These are the [latest results from run 2023-09-13T04:13:58.726542](https://huggingface.co/datasets/open-llm-leaderboard/details_rameshm__llama-2-13b-mathgpt-v4/blob/main/results_2023-09-13T04-13-58.726542.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.44053962757663656,
"acc_stderr": 0.03530747521526717,
"acc_norm": 0.444440355642725,
"acc_norm_stderr": 0.03529741095304998,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.4195799647316413,
"mc2_stderr": 0.015002986692523554
},
"harness|arc:challenge|25": {
"acc": 0.45819112627986347,
"acc_stderr": 0.014560220308714695,
"acc_norm": 0.5093856655290102,
"acc_norm_stderr": 0.014608816322065003
},
"harness|hellaswag|10": {
"acc": 0.5766779525990838,
"acc_stderr": 0.0049307573908973475,
"acc_norm": 0.7556263692491536,
"acc_norm_stderr": 0.004288369906733092
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.04026097083296558,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.04026097083296558
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4641509433962264,
"acc_stderr": 0.030693675018458006,
"acc_norm": 0.4641509433962264,
"acc_norm_stderr": 0.030693675018458006
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3872340425531915,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.3872340425531915,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5064516129032258,
"acc_stderr": 0.02844163823354051,
"acc_norm": 0.5064516129032258,
"acc_norm_stderr": 0.02844163823354051
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.44242424242424244,
"acc_stderr": 0.038783721137112745,
"acc_norm": 0.44242424242424244,
"acc_norm_stderr": 0.038783721137112745
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5353535353535354,
"acc_stderr": 0.035534363688280605,
"acc_norm": 0.5353535353535354,
"acc_norm_stderr": 0.035534363688280605
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6269430051813472,
"acc_stderr": 0.03490205592048573,
"acc_norm": 0.6269430051813472,
"acc_norm_stderr": 0.03490205592048573
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.40512820512820513,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.40512820512820513,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945266,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945266
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.44537815126050423,
"acc_stderr": 0.0322841062671639,
"acc_norm": 0.44537815126050423,
"acc_norm_stderr": 0.0322841062671639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5743119266055046,
"acc_stderr": 0.0211992359724708,
"acc_norm": 0.5743119266055046,
"acc_norm_stderr": 0.0211992359724708
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5245098039215687,
"acc_stderr": 0.03505093194348798,
"acc_norm": 0.5245098039215687,
"acc_norm_stderr": 0.03505093194348798
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5611814345991561,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.5611814345991561,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.03351695167652628,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.03351695167652628
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4961832061068702,
"acc_stderr": 0.04385162325601553,
"acc_norm": 0.4961832061068702,
"acc_norm_stderr": 0.04385162325601553
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4601226993865031,
"acc_stderr": 0.039158572914369714,
"acc_norm": 0.4601226993865031,
"acc_norm_stderr": 0.039158572914369714
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.5145631067961165,
"acc_stderr": 0.049486373240266356,
"acc_norm": 0.5145631067961165,
"acc_norm_stderr": 0.049486373240266356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.031660988918880785,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.031660988918880785
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5504469987228607,
"acc_stderr": 0.017788725283507337,
"acc_norm": 0.5504469987228607,
"acc_norm_stderr": 0.017788725283507337
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4919614147909968,
"acc_stderr": 0.028394421370984538,
"acc_norm": 0.4919614147909968,
"acc_norm_stderr": 0.028394421370984538
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.027801656212323667,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.027801656212323667
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.027640120545169924,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.027640120545169924
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3305084745762712,
"acc_stderr": 0.012014142101842963,
"acc_norm": 0.3305084745762712,
"acc_norm_stderr": 0.012014142101842963
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.02989616303312547,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.02989616303312547
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3872549019607843,
"acc_stderr": 0.01970687580408563,
"acc_norm": 0.3872549019607843,
"acc_norm_stderr": 0.01970687580408563
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794915,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794915
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5061224489795918,
"acc_stderr": 0.032006820201639086,
"acc_norm": 0.5061224489795918,
"acc_norm_stderr": 0.032006820201639086
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5906432748538012,
"acc_stderr": 0.03771283107626544,
"acc_norm": 0.5906432748538012,
"acc_norm_stderr": 0.03771283107626544
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.4195799647316413,
"mc2_stderr": 0.015002986692523554
}
}
```
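The top-level `all` block aggregates the per-task scores, so it can be sanity-checked directly from the entries above. Here is a minimal sketch that recomputes the mean accuracy over the `hendrycksTest` (MMLU) subtasks, assuming the JSON block above has been saved locally as `results.json` (a hypothetical file name):

```python
import json

# Hypothetical local copy of the results JSON shown above.
with open("results.json") as f:
    results = json.load(f)

# Keep only the 5-shot MMLU subtasks and average their accuracies.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mean_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc = {mean_acc:.4f}")
```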
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hello2mao/audio-caixukun | 2023-09-13T04:18:58.000Z | [
"license:openrail",
"region:us"
] | hello2mao | null | null | null | 0 | 0 | ---
license: openrail
---
|
onethousand/LPFF | 2023-09-21T06:52:56.000Z | [
"license:cc-by-nc-2.0",
"arxiv:2303.14407",
"region:us"
] | onethousand | null | null | null | 0 | 0 | ---
license: cc-by-nc-2.0
---
# LPFF: Large-Pose-Flickr-Faces Dataset
**LPFF is a large-pose Flickr face dataset comprising 19,590 high-quality real large-pose portrait images.**
> **[ICCV 2023] LPFF: A Portrait Dataset for Face Generators Across Large Poses**
>
> [Yiqian Wu](https://onethousandwu.com/), Jing Zhang, [Hongbo Fu](http://sweb.cityu.edu.hk/hongbofu/publications.html), [Xiaogang Jin*](http://www.cad.zju.edu.cn/home/jin)
[Paper](https://arxiv.org/abs/2303.14407) [Video](http://www.cad.zju.edu.cn/home/jin/iccv2023/demo.mp4) [Suppl](https://drive.google.com/file/d/1Xktg7oqMMNN9hqGYva3BBTJoux17y2SR/view?usp=sharing) [Project Page](http://www.cad.zju.edu.cn/home/jin/iccv2023/iccv2023.htm)
The creation of realistic 2D facial images and 3D face shapes using generative networks has been a hot topic in recent years. Existing face generators exhibit exceptional performance on faces in small to medium poses (with respect to frontal faces), but struggle to produce realistic results for large poses. The distorted rendering results on large poses in 3D-aware generators further show that the generated 3D face shapes are far from the distribution of 3D faces in reality. We find that the above issues are caused by the training dataset's pose imbalance.
In this paper, we present **LPFF**, a large-pose Flickr face dataset comprising 19,590 high-quality real large-pose portrait images. We utilize our dataset to train a 2D face generator that can process large-pose face images, as well as a 3D-aware generator that can generate realistic human face geometry. To better validate our pose-conditional 3D-aware generators, we develop a new FID measure to evaluate the 3D-level performance. Through this novel FID measure and other experiments, we show that LPFF can help 2D face generators extend their latent space and better manipulate the large-pose data, and help 3D-aware face generators achieve better view consistency and more realistic 3D reconstruction results.
### Available sources
Notice: we host all the data on OneDrive, and the shared links are refreshed every two months. If you find that a link is not working, please contact us so we can update it.
| | Description |
| ------------------------------------------------------------ | ------------------------------------------------------------ |
| [dataset](https://github.com/oneThousand1000/LPFF-dataset/tree/master/dataset_download) | Dataset download. |
| [data_processing](https://github.com/oneThousand1000/LPFF-dataset/tree/master/data_processing) | Data processing codes and data download links. Including image alignment, camera parameters extraction, and dataset rebalance. |
| [training](https://github.com/oneThousand1000/LPFF-dataset/tree/master/training) | Model training and FID computation guidance. |
| [networks](https://github.com/oneThousand1000/LPFF-dataset/tree/master/networks) | Pretrained StyleGAN2-ada and EG3D models trained on the LPFF+FFHQ dataset. |
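If you prefer to script the download rather than follow the links above, a sketch along these lines should work, assuming the archives are mirrored in this Hugging Face dataset repository (the OneDrive links behind the table remain the primary source):

```python
from huggingface_hub import snapshot_download

# Fetch whatever files are stored in the onethousand/LPFF dataset repo.
local_dir = snapshot_download(repo_id="onethousand/LPFF", repo_type="dataset")
print("LPFF files downloaded to", local_dir)
```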
### Contact
[onethousand@zju.edu.cn](mailto:onethousand@zju.edu.cn) / [onethousand1250@gmail.com](mailto:onethousand1250@gmail.com)
### Citation
If you find this project helpful to your research, please consider citing:
```
@inproceedings{wu2023iccvlpff,
author = {Yiqian Wu and Jing Zhang and Hongbo Fu and Xiaogang Jin},
title = {LPFF: A Portrait Dataset for Face Generators Across Large Poses},
booktitle = {2023 {IEEE/CVF} International Conference on Computer Vision, {ICCV}, France, October 2-3, 2023},
publisher = {{IEEE}},
year = {2023},
}
```
|
CyberHarem/houjou_karen_idolmastercinderellagirls | 2023-09-17T17:35:24.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of houjou_karen (THE iDOLM@STER: Cinderella Girls)
This is the dataset of houjou_karen (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 510 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 510 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 510 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 510 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
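A minimal sketch for fetching one of the packaged variants listed above with `huggingface_hub` (the repo id and file name are taken directly from this card; pick whichever zip matches your target resolution):

```python
import zipfile

from huggingface_hub import hf_hub_download

# Download the 512x704-aligned variant from the table above and unpack it.
zip_path = hf_hub_download(
    repo_id="CyberHarem/houjou_karen_idolmastercinderellagirls",
    filename="dataset-512x704.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall("houjou_karen_512x704")
```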
|
BG5/oneapi | 2023-09-13T04:21:14.000Z | [
"license:mit",
"region:us"
] | BG5 | null | null | null | 0 | 0 | ---
license: mit
---
|
dangvinh77/Mydrive_schoolaccount | 2023-09-13T04:41:00.000Z | [
"region:us"
] | dangvinh77 | null | null | null | 1 | 0 | Entry not found |
yzhuang/autotree_pmlb_100000_letter_sgosdt_l256_dim10_d3_sd0 | 2023-09-13T04:43:05.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 7061642624
num_examples: 100000
- name: validation
num_bytes: 709192128
num_examples: 10000
download_size: 300590767
dataset_size: 7770834752
---
# Dataset Card for "autotree_pmlb_100000_letter_sgosdt_l256_dim10_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NexaAI/Sweater | 2023-09-13T04:46:52.000Z | [
"region:us"
] | NexaAI | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_vihangd__smartyplats-3b-v1 | 2023-09-13T04:46:58.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of vihangd/smartyplats-3b-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vihangd/smartyplats-3b-v1](https://huggingface.co/vihangd/smartyplats-3b-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vihangd__smartyplats-3b-v1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T04:45:46.348158](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__smartyplats-3b-v1/blob/main/results_2023-09-13T04-45-46.348158.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25982372439316487,\n\
\ \"acc_stderr\": 0.03168373002574038,\n \"acc_norm\": 0.26342079348322606,\n\
\ \"acc_norm_stderr\": 0.03167912186887014,\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080508,\n \"mc2\": 0.3652581798300609,\n\
\ \"mc2_stderr\": 0.013914438833995325\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3779863481228669,\n \"acc_stderr\": 0.014169664520303103,\n\
\ \"acc_norm\": 0.4052901023890785,\n \"acc_norm_stderr\": 0.014346869060229321\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5236008763194583,\n\
\ \"acc_stderr\": 0.004984219681732655,\n \"acc_norm\": 0.7085241983668592,\n\
\ \"acc_norm_stderr\": 0.004535133886462033\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17777777777777778,\n\
\ \"acc_stderr\": 0.033027898599017176,\n \"acc_norm\": 0.17777777777777778,\n\
\ \"acc_norm_stderr\": 0.033027898599017176\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.02605529690115292,\n\
\ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.02605529690115292\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080339,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080339\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278008,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278008\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.21164021164021163,\n \"acc_stderr\": 0.021037331505262883,\n \"\
acc_norm\": 0.21164021164021163,\n \"acc_norm_stderr\": 0.021037331505262883\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2064516129032258,\n \"acc_stderr\": 0.02302589961718872,\n \"\
acc_norm\": 0.2064516129032258,\n \"acc_norm_stderr\": 0.02302589961718872\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.17733990147783252,\n \"acc_stderr\": 0.026874337276808345,\n \"\
acc_norm\": 0.17733990147783252,\n \"acc_norm_stderr\": 0.026874337276808345\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885416,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885416\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.1919191919191919,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.1919191919191919,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20725388601036268,\n \"acc_stderr\": 0.029252823291803613,\n\
\ \"acc_norm\": 0.20725388601036268,\n \"acc_norm_stderr\": 0.029252823291803613\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.02102067268082791,\n \
\ \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.02102067268082791\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959912,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959912\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02665353159671548,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02665353159671548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.03479185572599659,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.03479185572599659\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23853211009174313,\n \"acc_stderr\": 0.01827257581023187,\n \"\
acc_norm\": 0.23853211009174313,\n \"acc_norm_stderr\": 0.01827257581023187\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18055555555555555,\n \"acc_stderr\": 0.026232878971491652,\n \"\
acc_norm\": 0.18055555555555555,\n \"acc_norm_stderr\": 0.026232878971491652\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.336322869955157,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596919,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596919\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3034188034188034,\n\
\ \"acc_stderr\": 0.03011821010694266,\n \"acc_norm\": 0.3034188034188034,\n\
\ \"acc_norm_stderr\": 0.03011821010694266\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26436781609195403,\n\
\ \"acc_stderr\": 0.015769984840690518,\n \"acc_norm\": 0.26436781609195403,\n\
\ \"acc_norm_stderr\": 0.015769984840690518\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.02335736578587404,\n\
\ \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.02335736578587404\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2547486033519553,\n\
\ \"acc_stderr\": 0.014572650383409153,\n \"acc_norm\": 0.2547486033519553,\n\
\ \"acc_norm_stderr\": 0.014572650383409153\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.024170840879341016,\n\
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.024170840879341016\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n\
\ \"acc_stderr\": 0.02567025924218894,\n \"acc_norm\": 0.2861736334405145,\n\
\ \"acc_norm_stderr\": 0.02567025924218894\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2993827160493827,\n \"acc_stderr\": 0.02548311560119546,\n\
\ \"acc_norm\": 0.2993827160493827,\n \"acc_norm_stderr\": 0.02548311560119546\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2375886524822695,\n \"acc_stderr\": 0.025389512552729906,\n \
\ \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.025389512552729906\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23989569752281617,\n\
\ \"acc_stderr\": 0.010906282617981643,\n \"acc_norm\": 0.23989569752281617,\n\
\ \"acc_norm_stderr\": 0.010906282617981643\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.0242310133705411,\n\
\ \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.0242310133705411\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2647058823529412,\n \"acc_stderr\": 0.017848089574913226,\n \
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.017848089574913226\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.2818181818181818,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.20408163265306123,\n \"acc_stderr\": 0.025801283475090506,\n\
\ \"acc_norm\": 0.20408163265306123,\n \"acc_norm_stderr\": 0.025801283475090506\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2935323383084577,\n\
\ \"acc_stderr\": 0.03220024104534205,\n \"acc_norm\": 0.2935323383084577,\n\
\ \"acc_norm_stderr\": 0.03220024104534205\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.035650796707083106,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.035650796707083106\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080508,\n \"mc2\": 0.3652581798300609,\n\
\ \"mc2_stderr\": 0.013914438833995325\n }\n}\n```"
repo_url: https://huggingface.co/vihangd/smartyplats-3b-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|arc:challenge|25_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hellaswag|10_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T04-45-46.348158.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-45-46.348158.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T04-45-46.348158.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T04-45-46.348158.parquet'
- config_name: results
data_files:
- split: 2023_09_13T04_45_46.348158
path:
- results_2023-09-13T04-45-46.348158.parquet
- split: latest
path:
- results_2023-09-13T04-45-46.348158.parquet
---
# Dataset Card for Evaluation run of vihangd/smartyplats-3b-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/vihangd/smartyplats-3b-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [vihangd/smartyplats-3b-v1](https://huggingface.co/vihangd/smartyplats-3b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
# Load the TruthfulQA details; any config_name listed above works the same way.
data = load_dataset("open-llm-leaderboard/details_vihangd__smartyplats-3b-v1",
	"harness_truthfulqa_mc_0",
	split="latest")  # splits defined above: "latest" or the timestamped run
```
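Each of the per-task configurations listed in the YAML front matter can be loaded the same way. As a minimal sketch (assuming only the `datasets` library), here is how one of the MMLU subtask configs defined above could be fetched:
```python
from datasets import load_dataset

# "harness_hendrycksTest_world_religions_5" is one of the config_names
# declared above; the "latest" split resolves to the most recent run's parquet.
world_religions = load_dataset(
    "open-llm-leaderboard/details_vihangd__smartyplats-3b-v1",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(world_religions)  # inspect the per-example details for this subtask
```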
## Latest results
These are the [latest results from run 2023-09-13T04:45:46.348158](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__smartyplats-3b-v1/blob/main/results_2023-09-13T04-45-46.348158.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25982372439316487,
"acc_stderr": 0.03168373002574038,
"acc_norm": 0.26342079348322606,
"acc_norm_stderr": 0.03167912186887014,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080508,
"mc2": 0.3652581798300609,
"mc2_stderr": 0.013914438833995325
},
"harness|arc:challenge|25": {
"acc": 0.3779863481228669,
"acc_stderr": 0.014169664520303103,
"acc_norm": 0.4052901023890785,
"acc_norm_stderr": 0.014346869060229321
},
"harness|hellaswag|10": {
"acc": 0.5236008763194583,
"acc_stderr": 0.004984219681732655,
"acc_norm": 0.7085241983668592,
"acc_norm_stderr": 0.004535133886462033
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17777777777777778,
"acc_stderr": 0.033027898599017176,
"acc_norm": 0.17777777777777778,
"acc_norm_stderr": 0.033027898599017176
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.25,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.02605529690115292,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.02605529690115292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080339,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080339
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278008,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278008
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21164021164021163,
"acc_stderr": 0.021037331505262883,
"acc_norm": 0.21164021164021163,
"acc_norm_stderr": 0.021037331505262883
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2064516129032258,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.2064516129032258,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.17733990147783252,
"acc_stderr": 0.026874337276808345,
"acc_norm": 0.17733990147783252,
"acc_norm_stderr": 0.026874337276808345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885416,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885416
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1919191919191919,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.1919191919191919,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20725388601036268,
"acc_stderr": 0.029252823291803613,
"acc_norm": 0.20725388601036268,
"acc_norm_stderr": 0.029252823291803613
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.02102067268082791,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.02102067268082791
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959912,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959912
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02665353159671548,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02665353159671548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.03479185572599659,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.03479185572599659
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23853211009174313,
"acc_stderr": 0.01827257581023187,
"acc_norm": 0.23853211009174313,
"acc_norm_stderr": 0.01827257581023187
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18055555555555555,
"acc_stderr": 0.026232878971491652,
"acc_norm": 0.18055555555555555,
"acc_norm_stderr": 0.026232878971491652
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596919,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596919
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3034188034188034,
"acc_stderr": 0.03011821010694266,
"acc_norm": 0.3034188034188034,
"acc_norm_stderr": 0.03011821010694266
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.015769984840690518,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.015769984840690518
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.02335736578587404,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.02335736578587404
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2547486033519553,
"acc_stderr": 0.014572650383409153,
"acc_norm": 0.2547486033519553,
"acc_norm_stderr": 0.014572650383409153
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.024170840879341016,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.024170840879341016
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2861736334405145,
"acc_stderr": 0.02567025924218894,
"acc_norm": 0.2861736334405145,
"acc_norm_stderr": 0.02567025924218894
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2993827160493827,
"acc_stderr": 0.02548311560119546,
"acc_norm": 0.2993827160493827,
"acc_norm_stderr": 0.02548311560119546
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.025389512552729906,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.025389512552729906
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23989569752281617,
"acc_stderr": 0.010906282617981643,
"acc_norm": 0.23989569752281617,
"acc_norm_stderr": 0.010906282617981643
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19852941176470587,
"acc_stderr": 0.0242310133705411,
"acc_norm": 0.19852941176470587,
"acc_norm_stderr": 0.0242310133705411
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.017848089574913226,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.017848089574913226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20408163265306123,
"acc_stderr": 0.025801283475090506,
"acc_norm": 0.20408163265306123,
"acc_norm_stderr": 0.025801283475090506
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2935323383084577,
"acc_stderr": 0.03220024104534205,
"acc_norm": 0.2935323383084577,
"acc_norm_stderr": 0.03220024104534205
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080508,
"mc2": 0.3652581798300609,
"mc2_stderr": 0.013914438833995325
}
}
```
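Rather than parsing the JSON above by hand, the aggregated numbers can also be pulled through the `results` configuration declared in the YAML front matter. A minimal sketch, assuming the standard `datasets` API:
```python
from datasets import load_dataset

# The "results" config points at results_2023-09-13T04-45-46.348158.parquet;
# its "latest" split always tracks the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_vihangd__smartyplats-3b-v1",
    "results",
    split="latest",
)
print(results.column_names)  # see which aggregated fields are stored
```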
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
NexaAI/Nightwear | 2023-09-13T04:49:33.000Z | [
"region:us"
] | NexaAI | null | null | null | 0 | 0 | Entry not found |
Nil007/pdfhk-data | 2023-09-13T04:59:39.000Z | [
"region:us"
] | Nil007 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_DevaMalla__llama_7b_lora | 2023-09-13T05:08:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of DevaMalla/llama_7b_lora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DevaMalla/llama_7b_lora](https://huggingface.co/DevaMalla/llama_7b_lora) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DevaMalla__llama_7b_lora\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T05:07:37.970407](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_lora/blob/main/results_2023-09-13T05-07-37.970407.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3439215985439894,\n\
\ \"acc_stderr\": 0.03415938938192238,\n \"acc_norm\": 0.3475902476021068,\n\
\ \"acc_norm_stderr\": 0.034144506232932394,\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.34744866621498555,\n\
\ \"mc2_stderr\": 0.014034339049005806\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5162116040955631,\n \"acc_stderr\": 0.014603708567414943,\n\
\ \"acc_norm\": 0.5486348122866894,\n \"acc_norm_stderr\": 0.014542104569955265\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6069508066122287,\n\
\ \"acc_stderr\": 0.004874293964843518,\n \"acc_norm\": 0.7909778928500298,\n\
\ \"acc_norm_stderr\": 0.004057792171893576\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.35471698113207545,\n \"acc_stderr\": 0.02944517532819959,\n\
\ \"acc_norm\": 0.35471698113207545,\n \"acc_norm_stderr\": 0.02944517532819959\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2947976878612717,\n\
\ \"acc_stderr\": 0.034765996075164785,\n \"acc_norm\": 0.2947976878612717,\n\
\ \"acc_norm_stderr\": 0.034765996075164785\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33191489361702126,\n \"acc_stderr\": 0.030783736757745664,\n\
\ \"acc_norm\": 0.33191489361702126,\n \"acc_norm_stderr\": 0.030783736757745664\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03855289616378948,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03855289616378948\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.02293097307163335,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.02293097307163335\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.32903225806451614,\n \"acc_stderr\": 0.02672949906834996,\n \"\
acc_norm\": 0.32903225806451614,\n \"acc_norm_stderr\": 0.02672949906834996\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3103448275862069,\n \"acc_stderr\": 0.032550867699701024,\n \"\
acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.032550867699701024\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3575757575757576,\n \"acc_stderr\": 0.03742597043806587,\n\
\ \"acc_norm\": 0.3575757575757576,\n \"acc_norm_stderr\": 0.03742597043806587\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3282828282828283,\n \"acc_stderr\": 0.03345678422756776,\n \"\
acc_norm\": 0.3282828282828283,\n \"acc_norm_stderr\": 0.03345678422756776\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3626943005181347,\n \"acc_stderr\": 0.03469713791704373,\n\
\ \"acc_norm\": 0.3626943005181347,\n \"acc_norm_stderr\": 0.03469713791704373\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.28205128205128205,\n \"acc_stderr\": 0.022815813098896614,\n\
\ \"acc_norm\": 0.28205128205128205,\n \"acc_norm_stderr\": 0.022815813098896614\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23333333333333334,\n \"acc_stderr\": 0.025787874220959323,\n \
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.025787874220959323\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.029213549414372167,\n\
\ \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.029213549414372167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3963302752293578,\n \"acc_stderr\": 0.020971469947900525,\n \"\
acc_norm\": 0.3963302752293578,\n \"acc_norm_stderr\": 0.020971469947900525\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02835321286686344,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02835321286686344\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.39215686274509803,\n \"acc_stderr\": 0.03426712349247272,\n \"\
acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.03426712349247272\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3881856540084388,\n \"acc_stderr\": 0.031722950043323296,\n \
\ \"acc_norm\": 0.3881856540084388,\n \"acc_norm_stderr\": 0.031722950043323296\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4304932735426009,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.4304932735426009,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3511450381679389,\n \"acc_stderr\": 0.0418644516301375,\n\
\ \"acc_norm\": 0.3511450381679389,\n \"acc_norm_stderr\": 0.0418644516301375\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5041322314049587,\n \"acc_stderr\": 0.04564198767432754,\n \"\
acc_norm\": 0.5041322314049587,\n \"acc_norm_stderr\": 0.04564198767432754\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.04668408033024932,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.04668408033024932\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.32038834951456313,\n \"acc_stderr\": 0.04620284082280039,\n\
\ \"acc_norm\": 0.32038834951456313,\n \"acc_norm_stderr\": 0.04620284082280039\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.49145299145299143,\n\
\ \"acc_stderr\": 0.0327513030009703,\n \"acc_norm\": 0.49145299145299143,\n\
\ \"acc_norm_stderr\": 0.0327513030009703\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.40485312899106,\n\
\ \"acc_stderr\": 0.017553246467720256,\n \"acc_norm\": 0.40485312899106,\n\
\ \"acc_norm_stderr\": 0.017553246467720256\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.37572254335260113,\n \"acc_stderr\": 0.026074314851657083,\n\
\ \"acc_norm\": 0.37572254335260113,\n \"acc_norm_stderr\": 0.026074314851657083\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.02736359328468495,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.02736359328468495\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3279742765273312,\n\
\ \"acc_stderr\": 0.026664410886937617,\n \"acc_norm\": 0.3279742765273312,\n\
\ \"acc_norm_stderr\": 0.026664410886937617\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.36419753086419754,\n \"acc_stderr\": 0.02677492989972232,\n\
\ \"acc_norm\": 0.36419753086419754,\n \"acc_norm_stderr\": 0.02677492989972232\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340460983,\n \
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340460983\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2920469361147327,\n\
\ \"acc_stderr\": 0.01161334913627182,\n \"acc_norm\": 0.2920469361147327,\n\
\ \"acc_norm_stderr\": 0.01161334913627182\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877746,\n\
\ \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.36764705882352944,\n \"acc_stderr\": 0.019506291693954847,\n \
\ \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.019506291693954847\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4090909090909091,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.4090909090909091,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3346938775510204,\n \"acc_stderr\": 0.030209235226242307,\n\
\ \"acc_norm\": 0.3346938775510204,\n \"acc_norm_stderr\": 0.030209235226242307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3383084577114428,\n\
\ \"acc_stderr\": 0.03345563070339193,\n \"acc_norm\": 0.3383084577114428,\n\
\ \"acc_norm_stderr\": 0.03345563070339193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.03610805018031024,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.03610805018031024\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4678362573099415,\n \"acc_stderr\": 0.03826882417660369,\n\
\ \"acc_norm\": 0.4678362573099415,\n \"acc_norm_stderr\": 0.03826882417660369\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.34744866621498555,\n\
\ \"mc2_stderr\": 0.014034339049005806\n }\n}\n```"
repo_url: https://huggingface.co/DevaMalla/llama_7b_lora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|arc:challenge|25_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hellaswag|10_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T05-07-37.970407.parquet'
- config_name: results
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- results_2023-09-13T05-07-37.970407.parquet
- split: latest
path:
- results_2023-09-13T05-07-37.970407.parquet
---
# Dataset Card for Evaluation run of DevaMalla/llama_7b_lora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DevaMalla/llama_7b_lora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [DevaMalla/llama_7b_lora](https://huggingface.co/DevaMalla/llama_7b_lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DevaMalla__llama_7b_lora",
"harness_truthfulqa_mc_0",
        split="latest")
```
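The per-task details above have a counterpart with aggregated metrics: the `results` configuration listed in the YAML header. A minimal sketch for loading it (repo, config, and split names taken from the configuration list above):
```python
from datasets import load_dataset

# Load the aggregated metrics (the "results" config) at their latest split.
results = load_dataset(
    "open-llm-leaderboard/details_DevaMalla__llama_7b_lora",
    "results",
    split="latest",
)
print(results[0])  # a single row of aggregated metrics for this run
```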
## Latest results
These are the [latest results from run 2023-09-13T05:07:37.970407](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_lora/blob/main/results_2023-09-13T05-07-37.970407.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.3439215985439894,
"acc_stderr": 0.03415938938192238,
"acc_norm": 0.3475902476021068,
"acc_norm_stderr": 0.034144506232932394,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.34744866621498555,
"mc2_stderr": 0.014034339049005806
},
"harness|arc:challenge|25": {
"acc": 0.5162116040955631,
"acc_stderr": 0.014603708567414943,
"acc_norm": 0.5486348122866894,
"acc_norm_stderr": 0.014542104569955265
},
"harness|hellaswag|10": {
"acc": 0.6069508066122287,
"acc_stderr": 0.004874293964843518,
"acc_norm": 0.7909778928500298,
"acc_norm_stderr": 0.004057792171893576
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.28289473684210525,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.28289473684210525,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.35471698113207545,
"acc_stderr": 0.02944517532819959,
"acc_norm": 0.35471698113207545,
"acc_norm_stderr": 0.02944517532819959
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.034765996075164785,
"acc_norm": 0.2947976878612717,
"acc_norm_stderr": 0.034765996075164785
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33191489361702126,
"acc_stderr": 0.030783736757745664,
"acc_norm": 0.33191489361702126,
"acc_norm_stderr": 0.030783736757745664
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03855289616378948,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03855289616378948
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.02293097307163335,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.02293097307163335
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.32903225806451614,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.32903225806451614,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3575757575757576,
"acc_stderr": 0.03742597043806587,
"acc_norm": 0.3575757575757576,
"acc_norm_stderr": 0.03742597043806587
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3282828282828283,
"acc_stderr": 0.03345678422756776,
"acc_norm": 0.3282828282828283,
"acc_norm_stderr": 0.03345678422756776
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3626943005181347,
"acc_stderr": 0.03469713791704373,
"acc_norm": 0.3626943005181347,
"acc_norm_stderr": 0.03469713791704373
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.022815813098896614,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.022815813098896614
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.025787874220959323,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.025787874220959323
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2815126050420168,
"acc_stderr": 0.029213549414372167,
"acc_norm": 0.2815126050420168,
"acc_norm_stderr": 0.029213549414372167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3963302752293578,
"acc_stderr": 0.020971469947900525,
"acc_norm": 0.3963302752293578,
"acc_norm_stderr": 0.020971469947900525
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02835321286686344,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02835321286686344
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.03426712349247272,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.03426712349247272
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3881856540084388,
"acc_stderr": 0.031722950043323296,
"acc_norm": 0.3881856540084388,
"acc_norm_stderr": 0.031722950043323296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4304932735426009,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.4304932735426009,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3511450381679389,
"acc_stderr": 0.0418644516301375,
"acc_norm": 0.3511450381679389,
"acc_norm_stderr": 0.0418644516301375
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5041322314049587,
"acc_stderr": 0.04564198767432754,
"acc_norm": 0.5041322314049587,
"acc_norm_stderr": 0.04564198767432754
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04668408033024932,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04668408033024932
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.32038834951456313,
"acc_stderr": 0.04620284082280039,
"acc_norm": 0.32038834951456313,
"acc_norm_stderr": 0.04620284082280039
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.49145299145299143,
"acc_stderr": 0.0327513030009703,
"acc_norm": 0.49145299145299143,
"acc_norm_stderr": 0.0327513030009703
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.40485312899106,
"acc_stderr": 0.017553246467720256,
"acc_norm": 0.40485312899106,
"acc_norm_stderr": 0.017553246467720256
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.026074314851657083,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.026074314851657083
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.02736359328468495,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.02736359328468495
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3279742765273312,
"acc_stderr": 0.026664410886937617,
"acc_norm": 0.3279742765273312,
"acc_norm_stderr": 0.026664410886937617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.36419753086419754,
"acc_stderr": 0.02677492989972232,
"acc_norm": 0.36419753086419754,
"acc_norm_stderr": 0.02677492989972232
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340460983,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340460983
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2920469361147327,
"acc_stderr": 0.01161334913627182,
"acc_norm": 0.2920469361147327,
"acc_norm_stderr": 0.01161334913627182
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.029935342707877746,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.029935342707877746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.019506291693954847,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.019506291693954847
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3346938775510204,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.3346938775510204,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3383084577114428,
"acc_stderr": 0.03345563070339193,
"acc_norm": 0.3383084577114428,
"acc_norm_stderr": 0.03345563070339193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.03610805018031024,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.03610805018031024
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4678362573099415,
"acc_stderr": 0.03826882417660369,
"acc_norm": 0.4678362573099415,
"acc_norm_stderr": 0.03826882417660369
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.34744866621498555,
"mc2_stderr": 0.014034339049005806
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
TSjB/qm_ru_parallel_260602 | 2023-09-21T08:38:32.000Z | [
"language:krc",
"language:ru",
"license:cc-by-nc-sa-4.0",
"region:us"
] | TSjB | null | null | null | 0 | 0 | ---
license: cc-by-nc-sa-4.0
language:
- krc
- ru
---
260,602 parallel sentences between the Russian and Qarachay-Malqar languages.
Because of the dialects of the Qarachay-Malqar language and diphthong changes, some letters are written in the Latin transliteration as follows (Latin letter - Cyrillic equivalents):
b - б/п/ф
w - ў
q - къ
g - гъ
n - нг
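For scripting against this corpus, the correspondences above can be expressed as a simple mapping; this is only an illustrative sketch (`LATIN_TO_CYRILLIC` is a hypothetical name, not part of the dataset):
```python
# Illustrative only: Latin letters used in this corpus and the Cyrillic
# letters each one may stand for, per the list above.
LATIN_TO_CYRILLIC = {
    "b": ("б", "п", "ф"),
    "w": ("ў",),
    "q": ("къ",),
    "g": ("гъ",),
    "n": ("нг",),
}
```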
Taken from: the Alan Nart epos, a Qarachay-Malqar folklore collection, films, Kuliev's poems, a phrasebook, the Uzden codex of the Qarachay-Malqar, the Koran, the Gospel, the Psalter, the Book of the Prophet Jonah, the Book of the Prophet Daniel, Ruth, Esther, and a Qarachay-Malqar dictionary. |
NexaAI/Boot | 2023-09-13T06:00:36.000Z | [
"region:us"
] | NexaAI | null | null | null | 0 | 0 | Entry not found |
pharaouk/filtered-1 | 2023-09-13T08:35:32.000Z | [
"region:us"
] | pharaouk | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
- name: embedding
sequence: float32
- name: inst_prob
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 14851302435
num_examples: 2506367
download_size: 8200041049
dataset_size: 14851302435
---
# Dataset Card for "filtered-1"
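A minimal loading sketch, assuming the default configuration and `train` split declared in the YAML header above:
```python
from datasets import load_dataset

# Load the single "train" split of the default config.
ds = load_dataset("pharaouk/filtered-1", split="train")
print(ds.column_names)
# Fields per the dataset_info above: text, conversation_id, dataset_id,
# unique_conversation_id, embedding, inst_prob, __index_level_0__
```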
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dermisolveskintagremoverpick/dermisolveskin | 2023-09-13T06:31:55.000Z | [
"region:us"
] | dermisolveskintagremoverpick | null | null | null | 0 | 0 | **Dermisolve Skin Tag Remover Serum Reviews,** USA: Skin moles spoil the look of your face and make you look old and dull. Small warts, skin tags, and moles are caused due to several reasons such as hormonal imbalance, and lack of nutrients and moisture. Many creams and serums have been introduced in the market to treat moles and skin tags. Some products are not safe to use on the skin because of artificial preservatives and flavors.
If you want to remove skin moles naturally, start using Dermisolve Skin Tag Remover and Mole Corrector Serum daily. This serum is natural and free of chemicals and artificial preservatives. It may help to eliminate moles, skin tags, and other skin issues within some weeks. Below, we discuss this product in detail: its ingredients and formula, its benefits, the medical study behind it, how it works, and how to apply the serum to the face.
Summary of Dermisolve Skin Tag Serum
-------------------------------------
Dermisolve Skin Tag Serum is a natural product made of organic ingredients. It may help to reduce the appearance of skin issues such as moles, tags, and small warts. The serum may give blemish-free skin within some weeks by using this serum on the face. It may enhance the skin tone and make you look younger than your real age. Moreover, this organic product may help to improve skin health within some weeks.
[Visit the Official Website of Dermisolve Skin Tag Serum](https://www.glitco.com/get-dermisolve-skin-tag-remover)
Active ingredients and formula of Dermisolve Skin Tag Serum
------------------------------------------------------------
Dermisolve mole corrector & skin tag remover serum is made of all handpicked and natural ingredients. Let us discuss the ingredients in this serum in the below section:
➤ Sanguinaria Canadensis
Sanguinaria Canadensis is a kind of flowering plant that grows in North America. It helps in supplying white blood cells in the body and removing blemishes on the face within some weeks.
➤ Zincum Muriaticum
This is a type of mineral that is present in abundance in Earth’s crust and contains antiseptic qualities. Zincum Muriaticum may help in removing skin tags or moles and heal the skin.
All these ingredients are picked from nature and checked in the labs by medical experts. This serum may not contain artificial ingredients, flavors, colors, parabens, chemicals, or gases. It may suit all kinds of skin and work without causing any skin allergies, inflammation, redness, or itchiness.
How is this product made?
--------------------------
Dermisolve Skin Tag Serum is made in clean conditions by medical experts. This product is made with the help of advanced filtration methods and techniques. It is a recommended product by top skin doctors.
This serum is made according to high industry standards. It may show better results in the body within some weeks.
Dermisolve Skin Tag Remover Price
----------------------------------
The cost of a Dermisolve Skin Tag Serum bottle is much lower than that of other competitors on the market. You can get the “Buy 1 Get 1 Free” pack of Dermisolve serum for $61.61 per bottle only. Another pack, “Buy 2 Get 1 Free”, is available for $55.97/bottle. You can also go for the most valued pack of Dermisolve mole corrector serum in the USA; the cost of this pack is only $39.91/bottle.
[Shipping is FREE if you order Dermisolve Serum from the official website only.](https://www.glitco.com/get-dermisolve-skin-tag-remover)
### Medical Study on Dermisolve Skin Tag Serum
According to the latest study, many Americans suffer from minor skin issues such as moles, skin tags, and small and big warts. Some patients use artificial creams to reduce the skin issues such as moles, skin tags, and light moles. Normal creams and serums cause skin allergies, inflammation, and redness.
A group of special skin doctors and medical practitioners developed a natural product Dermisolve Skin Tag Serum by adding natural ingredients. This product is used by many patients with skin issues such as moles, small warts, and big warts.
Most people who use this serum get faster results on the face. They get quick relief from various skin issues such as moles, small warts, skin tags, and light moles. It is one of the best products that received positive reviews from many customers.
How does the serum work on the skin?
-------------------------------------
A few drops of Dermisolve Skin Tag Serum 30ml may work on the skin to remove moles, skin tags, and blemishes. It may work to supply white blood cells to the skin and remove minor skin problems like skin tags, moles, and small warts. Apart from that, the serum may also help to heal skin within a few weeks. It may reduce pain in some weeks and make your skin beautiful.
The serum may help to improve skin tone by reducing black marks and other dark spots. It may give spot-free skin within some weeks.
[Get Dermisolve Serum from the Official Website Only](https://www.glitco.com/get-dermisolve-skin-tag-remover)
Benefits of Dermisolve Skin Tag Serum
--------------------------------------
Dermisolve Skin Tag Remover Serum is a natural product made of handpicked ingredients and natural components. It may give various benefits to the skin such as:
➣ May reduce skin tags and moles
This natural serum may supply white blood cells to the skin and reduce various skin issues like moles and skin tags. It may also help to reduce the growth of moles and small warts on the skin and make you look beautiful like before.
➣ May give younger-looking skin
Handpicked ingredients of Dermisolve Mole Remover Serum may work speedily on the skin to reduce skin issues such as moles and skin tags. It may make your skin look beautiful and flawless in some weeks. You may also gain a youthful appearance on the face after applying this serum for a few weeks.
➣ May improve skin tone
This natural serum may help to remove skin impurities and toxins and enhance skin tone. It may give flawless skin within some weeks. The serum may also help to reduce other skin problems such as small and big warts, pimples, and acne.
### How to apply the serum on the skin?
You have to take a few drops of Dermisolve Skin Tag Serum and apply it on the affected areas of the skin. Let the serum work on the skin for around 8 hours. It may help to reduce the appearance of moles, skin tags, and small warts on the skin with regular application.
### Where to Buy Dermisolve Skin Tag Remover Serum?
Get the most in-demand skin tag serum for smooth removal of tags, warts, and moles. We recommend ordering it from the Official Website only, for safe and secure payment.
Final words
------------
Dermisolve warts and skin tag remover serum is a good remedy for major skin issues such as moles, tags and small warts. It may work better than other traditional creams and serums. With a regular application of the serum on the face, you may gain relief from moles, skin tags and small warts.
If you are looking for a good product for reducing skin moles, try Dermisolve Skin Tag Serum daily to remove tags safely and painlessly. It may give you the beautiful and spotless skin you have always wanted. |
DataOceanAI/facial_gesture_images | 2023-09-13T06:41:06.000Z | [
"task_categories:feature-extraction",
"task_categories:token-classification",
"size_categories:100K<n<1M",
"license:other",
"dataoceanai",
"face",
"images",
"facial",
"gestures",
"region:us"
] | DataOceanAI | null | null | null | 0 | 0 | ---
license: other
task_categories:
- feature-extraction
- token-classification
tags:
- dataoceanai
- face
- images
- facial
- gestures
pretty_name: ' facial_gesture_images'
size_categories:
- 100K<n<1M
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:** https://en.dataoceanai.com/dataset/c61-6365.htm

### Dataset Summary
This database of different skin tones and gesture types covers a total of 23,945 images collected from 924 Asian, Caucasian, and Black people, most of whom are young or middle-aged. It can be used for tasks such as facial expression recognition and object detection.
### Licensing Information
private
### Citation Information
```
@MISC{Ai_undated-zh,
  title    = "Facial_Gesture_Images",
  author   = "Ai, Dataoceanai",
  language = "en"
}
```
### Contributions
[More Information Needed] |
CyberHarem/shiomi_shuuko_idolmastercinderellagirls | 2023-09-17T17:35:26.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shiomi_shuuko (THE iDOLM@STER: Cinderella Girls)
This is the dataset of shiomi_shuuko (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 507 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 507 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 507 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 507 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
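
As a rough, unofficial sketch of how one of the packages above could be fetched programmatically — the zip filename comes from the table, while the output directory is made up — `huggingface_hub` can be used like this:

```python
# Minimal sketch (assumption: the zip files sit at the root of this dataset repo).
import zipfile

from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="CyberHarem/shiomi_shuuko_idolmastercinderellagirls",
    filename="dataset-640x880.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(path) as zf:
    zf.extractall("shiomi_shuuko_640x880")  # hypothetical local directory
```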
|
KaraKaraWitch/Discordian | 2023-09-13T07:28:20.000Z | [
"region:us"
] | KaraKaraWitch | null | null | null | 0 | 0 | [Ping! 2](https://www.youtube.com/watch?v=8CYy9jNmpXM) |
Vicky0522/MIT-Adobe5k-for-RSFNet | 2023-09-15T04:30:08.000Z | [
"arxiv:2303.08682",
"region:us"
] | Vicky0522 | null | null | null | 0 | 0 | Processed [MIT-Adobe5k](https://data.csail.mit.edu/graphics/fivek/) datasets for RSFNet
Paper: https://arxiv.org/abs/2303.08682
Code: https://github.com/Vicky0522/RSFNet
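As a non-authoritative sketch, the processed files can be mirrored locally with `huggingface_hub` (only the repo id comes from this page; the exact file layout depends on the repo):

```python
# Minimal sketch: download the whole processed dataset repo for use with RSFNet.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="Vicky0522/MIT-Adobe5k-for-RSFNet",
    repo_type="dataset",
)
print(local_dir)  # local path containing the processed MIT-Adobe5k files
```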
If our work is helpful for your research, please consider citing:
```
@article{oywq2023rsfnet,
title={RSFNet: A White-Box Image Retouching Approach Using Region-Specific Color Filters},
author={Wenqi Ouyang and Yi Dong and Xiaoyang Kang and Peiran Ren and Xin Xu and Xuansong Xie},
journal={https://arxiv.org/abs/2303.08682},
year={2023}
}
```
|
kevinliu0619/aifc-test-data | 2023-09-13T07:11:02.000Z | [
"license:openrail",
"region:us"
] | kevinliu0619 | null | null | null | 0 | 0 | ---
license: openrail
---
|
AnhSimon/anhgym | 2023-09-17T03:22:02.000Z | [
"region:us"
] | AnhSimon | null | null | null | 0 | 0 | Entry not found |
rukkuhru/kuzushizi | 2023-09-13T07:52:01.000Z | [
"region:us"
] | rukkuhru | null | null | null | 0 | 0 | Entry not found |
DucHaiten/anime-SDXL | 2023-09-29T19:04:53.000Z | [
"license:creativeml-openrail-m",
"region:us"
] | DucHaiten | null | null | null | 0 | 0 | ---
license: creativeml-openrail-m
---
|
Andyrasika/prompt-recommendation | 2023-09-13T08:11:27.000Z | [
"region:us"
] | Andyrasika | null | null | null | 1 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 64111
num_examples: 100
- name: eval
num_bytes: 13427
num_examples: 21
download_size: 18652
dataset_size: 77538
---
# Dataset Card for "prompt-recommendation"
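
Given the `dataset_info` block above (columns `id`, `source`, `target`; splits `train` with 100 rows and `eval` with 21), a minimal loading sketch with 🤗 Datasets might look like this — only the repo id, column names, and row counts are taken from this card:

```python
from datasets import load_dataset

# Load both splits declared in the dataset_info metadata above.
ds = load_dataset("Andyrasika/prompt-recommendation")
print(ds["train"][0])       # expected keys: "id", "source", "target"
print(ds["eval"].num_rows)  # 21, per the split metadata
```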
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pwc-india/fabric_dataset | 2023-09-13T08:39:01.000Z | [
"region:us"
] | pwc-india | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 41259319.0
num_examples: 20
download_size: 41261924
dataset_size: 41259319.0
---
# Dataset Card for "fabric_dataset"
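
Since the `dataset_info` above declares an `image` feature alongside `text`, a hedged inspection sketch (feature names from the YAML; everything else is an assumption) could be:

```python
from datasets import load_dataset

ds = load_dataset("pwc-india/fabric_dataset", split="train")
sample = ds[0]
print(sample["text"])        # caption / description string
print(sample["image"].size)  # the Image feature decodes to a PIL.Image
```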
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pwc-india/images | 2023-09-13T08:24:03.000Z | [
"region:us"
] | pwc-india | null | null | null | 0 | 0 | Entry not found |
YaBoyFathoM/Chat-Segmentation | 2023-09-13T08:30:45.000Z | [
"region:us"
] | YaBoyFathoM | null | null | null | 0 | 0 | Entry not found |
nzindoc/multiple_myeloma_study_dictionary | 2023-09-13T08:36:16.000Z | [
"license:apache-2.0",
"region:us"
] | nzindoc | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
171kool/pest | 2023-09-13T08:41:15.000Z | [
"region:us"
] | 171kool | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_DevaMalla__llama_7b_qlora | 2023-09-13T08:45:17.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of DevaMalla/llama_7b_qlora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DevaMalla/llama_7b_qlora](https://huggingface.co/DevaMalla/llama_7b_qlora) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DevaMalla__llama_7b_qlora\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T08:44:02.793862](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_qlora/blob/main/results_2023-09-13T08-44-02.793862.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3637278038748833,\n\
\ \"acc_stderr\": 0.03450650741988427,\n \"acc_norm\": 0.3675977276693493,\n\
\ \"acc_norm_stderr\": 0.03449202057327388,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871108,\n \"mc2\": 0.3397939807337936,\n\
\ \"mc2_stderr\": 0.01384183036853163\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5093856655290102,\n \"acc_stderr\": 0.014608816322065,\n\
\ \"acc_norm\": 0.5511945392491467,\n \"acc_norm_stderr\": 0.014534599585097667\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5960963951404102,\n\
\ \"acc_stderr\": 0.004896757857022546,\n \"acc_norm\": 0.7826130252937662,\n\
\ \"acc_norm_stderr\": 0.004116250643976748\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4188679245283019,\n \"acc_stderr\": 0.03036505082911521,\n\
\ \"acc_norm\": 0.4188679245283019,\n \"acc_norm_stderr\": 0.03036505082911521\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\
\ \"acc_stderr\": 0.040329990539607195,\n \"acc_norm\": 0.3680555555555556,\n\
\ \"acc_norm_stderr\": 0.040329990539607195\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3063583815028902,\n\
\ \"acc_stderr\": 0.03514942551267437,\n \"acc_norm\": 0.3063583815028902,\n\
\ \"acc_norm_stderr\": 0.03514942551267437\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149352,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149352\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.028346963777162466,\n\
\ \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.028346963777162466\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03855289616378948,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03855289616378948\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.34838709677419355,\n\
\ \"acc_stderr\": 0.02710482632810094,\n \"acc_norm\": 0.34838709677419355,\n\
\ \"acc_norm_stderr\": 0.02710482632810094\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n\
\ \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3575757575757576,\n \"acc_stderr\": 0.03742597043806587,\n\
\ \"acc_norm\": 0.3575757575757576,\n \"acc_norm_stderr\": 0.03742597043806587\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.42424242424242425,\n \"acc_stderr\": 0.035212249088415824,\n \"\
acc_norm\": 0.42424242424242425,\n \"acc_norm_stderr\": 0.035212249088415824\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5181347150259067,\n \"acc_stderr\": 0.03606065001832917,\n\
\ \"acc_norm\": 0.5181347150259067,\n \"acc_norm_stderr\": 0.03606065001832917\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3128205128205128,\n \"acc_stderr\": 0.023507579020645344,\n\
\ \"acc_norm\": 0.3128205128205128,\n \"acc_norm_stderr\": 0.023507579020645344\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844065,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844065\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2689075630252101,\n \"acc_stderr\": 0.028801392193631273,\n\
\ \"acc_norm\": 0.2689075630252101,\n \"acc_norm_stderr\": 0.028801392193631273\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.42385321100917434,\n \"acc_stderr\": 0.021187263209087516,\n \"\
acc_norm\": 0.42385321100917434,\n \"acc_norm_stderr\": 0.021187263209087516\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"\
acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.38235294117647056,\n \"acc_stderr\": 0.034107853389047184,\n \"\
acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.034107853389047184\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4050632911392405,\n \"acc_stderr\": 0.03195514741370673,\n \
\ \"acc_norm\": 0.4050632911392405,\n \"acc_norm_stderr\": 0.03195514741370673\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34977578475336324,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.34977578475336324,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3816793893129771,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.3816793893129771,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.49586776859504134,\n \"acc_stderr\": 0.045641987674327526,\n \"\
acc_norm\": 0.49586776859504134,\n \"acc_norm_stderr\": 0.045641987674327526\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.047128212574267705,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.047128212574267705\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3987730061349693,\n \"acc_stderr\": 0.03847021420456024,\n\
\ \"acc_norm\": 0.3987730061349693,\n \"acc_norm_stderr\": 0.03847021420456024\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.049505043821289195,\n\
\ \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.049505043821289195\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4829059829059829,\n\
\ \"acc_stderr\": 0.032736940493481824,\n \"acc_norm\": 0.4829059829059829,\n\
\ \"acc_norm_stderr\": 0.032736940493481824\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4942528735632184,\n\
\ \"acc_stderr\": 0.017878782326129234,\n \"acc_norm\": 0.4942528735632184,\n\
\ \"acc_norm_stderr\": 0.017878782326129234\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.40173410404624277,\n \"acc_stderr\": 0.026394104177643634,\n\
\ \"acc_norm\": 0.40173410404624277,\n \"acc_norm_stderr\": 0.026394104177643634\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.01433352205921789,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.01433352205921789\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3562091503267974,\n \"acc_stderr\": 0.02742047766262925,\n\
\ \"acc_norm\": 0.3562091503267974,\n \"acc_norm_stderr\": 0.02742047766262925\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.42765273311897106,\n\
\ \"acc_stderr\": 0.028099240775809574,\n \"acc_norm\": 0.42765273311897106,\n\
\ \"acc_norm_stderr\": 0.028099240775809574\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.41358024691358025,\n \"acc_stderr\": 0.02740204204026996,\n\
\ \"acc_norm\": 0.41358024691358025,\n \"acc_norm_stderr\": 0.02740204204026996\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2801418439716312,\n \"acc_stderr\": 0.02678917235114024,\n \
\ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.02678917235114024\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28878748370273793,\n\
\ \"acc_stderr\": 0.011574914757219962,\n \"acc_norm\": 0.28878748370273793,\n\
\ \"acc_norm_stderr\": 0.011574914757219962\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.029896163033125478,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.029896163033125478\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3431372549019608,\n \"acc_stderr\": 0.01920660684882537,\n \
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.01920660684882537\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.38181818181818183,\n\
\ \"acc_stderr\": 0.04653429807913508,\n \"acc_norm\": 0.38181818181818183,\n\
\ \"acc_norm_stderr\": 0.04653429807913508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3224489795918367,\n \"acc_stderr\": 0.029923100563683903,\n\
\ \"acc_norm\": 0.3224489795918367,\n \"acc_norm_stderr\": 0.029923100563683903\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.39303482587064675,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.39303482587064675,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.03777798822748017,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.03777798822748017\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4853801169590643,\n \"acc_stderr\": 0.038331852752130205,\n\
\ \"acc_norm\": 0.4853801169590643,\n \"acc_norm_stderr\": 0.038331852752130205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871108,\n \"mc2\": 0.3397939807337936,\n\
\ \"mc2_stderr\": 0.01384183036853163\n }\n}\n```"
repo_url: https://huggingface.co/DevaMalla/llama_7b_qlora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|arc:challenge|25_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hellaswag|10_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T08-44-02.793862.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T08-44-02.793862.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T08-44-02.793862.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T08-44-02.793862.parquet'
- config_name: results
data_files:
- split: 2023_09_13T08_44_02.793862
path:
- results_2023-09-13T08-44-02.793862.parquet
- split: latest
path:
- results_2023-09-13T08-44-02.793862.parquet
---
# Dataset Card for Evaluation run of DevaMalla/llama_7b_qlora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DevaMalla/llama_7b_qlora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [DevaMalla/llama_7b_qlora](https://huggingface.co/DevaMalla/llama_7b_qlora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DevaMalla__llama_7b_qlora",
"harness_truthfulqa_mc_0",
split="train")
```
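
The aggregated metrics live in the "results" configuration described above. As a small illustrative variation on the same `load_dataset` call (config and split names are taken from this card's configuration list):

```python
from datasets import load_dataset

# "latest" always aliases the most recent run's aggregated results.
results = load_dataset(
    "open-llm-leaderboard/details_DevaMalla__llama_7b_qlora",
    "results",
    split="latest",
)
print(results[0])
```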
## Latest results
These are the [latest results from run 2023-09-13T08:44:02.793862](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_qlora/blob/main/results_2023-09-13T08-44-02.793862.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3637278038748833,
"acc_stderr": 0.03450650741988427,
"acc_norm": 0.3675977276693493,
"acc_norm_stderr": 0.03449202057327388,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871108,
"mc2": 0.3397939807337936,
"mc2_stderr": 0.01384183036853163
},
"harness|arc:challenge|25": {
"acc": 0.5093856655290102,
"acc_stderr": 0.014608816322065,
"acc_norm": 0.5511945392491467,
"acc_norm_stderr": 0.014534599585097667
},
"harness|hellaswag|10": {
"acc": 0.5960963951404102,
"acc_stderr": 0.004896757857022546,
"acc_norm": 0.7826130252937662,
"acc_norm_stderr": 0.004116250643976748
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4188679245283019,
"acc_stderr": 0.03036505082911521,
"acc_norm": 0.4188679245283019,
"acc_norm_stderr": 0.03036505082911521
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.040329990539607195,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.040329990539607195
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3063583815028902,
"acc_stderr": 0.03514942551267437,
"acc_norm": 0.3063583815028902,
"acc_norm_stderr": 0.03514942551267437
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149352,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149352
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.028346963777162466,
"acc_norm": 0.251063829787234,
"acc_norm_stderr": 0.028346963777162466
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03855289616378948,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03855289616378948
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.34838709677419355,
"acc_stderr": 0.02710482632810094,
"acc_norm": 0.34838709677419355,
"acc_norm_stderr": 0.02710482632810094
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.0319474007226554,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.0319474007226554
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3575757575757576,
"acc_stderr": 0.03742597043806587,
"acc_norm": 0.3575757575757576,
"acc_norm_stderr": 0.03742597043806587
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.42424242424242425,
"acc_stderr": 0.035212249088415824,
"acc_norm": 0.42424242424242425,
"acc_norm_stderr": 0.035212249088415824
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5181347150259067,
"acc_stderr": 0.03606065001832917,
"acc_norm": 0.5181347150259067,
"acc_norm_stderr": 0.03606065001832917
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3128205128205128,
"acc_stderr": 0.023507579020645344,
"acc_norm": 0.3128205128205128,
"acc_norm_stderr": 0.023507579020645344
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844065,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844065
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2689075630252101,
"acc_stderr": 0.028801392193631273,
"acc_norm": 0.2689075630252101,
"acc_norm_stderr": 0.028801392193631273
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.42385321100917434,
"acc_stderr": 0.021187263209087516,
"acc_norm": 0.42385321100917434,
"acc_norm_stderr": 0.021187263209087516
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.034107853389047184,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.034107853389047184
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4050632911392405,
"acc_stderr": 0.03195514741370673,
"acc_norm": 0.4050632911392405,
"acc_norm_stderr": 0.03195514741370673
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34977578475336324,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.34977578475336324,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3816793893129771,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.3816793893129771,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.49586776859504134,
"acc_stderr": 0.045641987674327526,
"acc_norm": 0.49586776859504134,
"acc_norm_stderr": 0.045641987674327526
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.047128212574267705,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.047128212574267705
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3987730061349693,
"acc_stderr": 0.03847021420456024,
"acc_norm": 0.3987730061349693,
"acc_norm_stderr": 0.03847021420456024
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.49514563106796117,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.49514563106796117,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4829059829059829,
"acc_stderr": 0.032736940493481824,
"acc_norm": 0.4829059829059829,
"acc_norm_stderr": 0.032736940493481824
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4942528735632184,
"acc_stderr": 0.017878782326129234,
"acc_norm": 0.4942528735632184,
"acc_norm_stderr": 0.017878782326129234
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.40173410404624277,
"acc_stderr": 0.026394104177643634,
"acc_norm": 0.40173410404624277,
"acc_norm_stderr": 0.026394104177643634
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.01433352205921789,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.01433352205921789
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3562091503267974,
"acc_stderr": 0.02742047766262925,
"acc_norm": 0.3562091503267974,
"acc_norm_stderr": 0.02742047766262925
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.42765273311897106,
"acc_stderr": 0.028099240775809574,
"acc_norm": 0.42765273311897106,
"acc_norm_stderr": 0.028099240775809574
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.41358024691358025,
"acc_stderr": 0.02740204204026996,
"acc_norm": 0.41358024691358025,
"acc_norm_stderr": 0.02740204204026996
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.02678917235114024,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.02678917235114024
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.28878748370273793,
"acc_stderr": 0.011574914757219962,
"acc_norm": 0.28878748370273793,
"acc_norm_stderr": 0.011574914757219962
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.029896163033125478,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.029896163033125478
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.01920660684882537,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.01920660684882537
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.38181818181818183,
"acc_stderr": 0.04653429807913508,
"acc_norm": 0.38181818181818183,
"acc_norm_stderr": 0.04653429807913508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3224489795918367,
"acc_stderr": 0.029923100563683903,
"acc_norm": 0.3224489795918367,
"acc_norm_stderr": 0.029923100563683903
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.39303482587064675,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.39303482587064675,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748017,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748017
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4853801169590643,
"acc_stderr": 0.038331852752130205,
"acc_norm": 0.4853801169590643,
"acc_norm_stderr": 0.038331852752130205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871108,
"mc2": 0.3397939807337936,
"mc2_stderr": 0.01384183036853163
}
}
```
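As a quick illustration (a minimal sketch, not part of the generated card), per-task entries in a results dict shaped like the JSON above can be aggregated with plain Python. The two `acc_norm` values below are copied from that JSON; the rest of the dict is trimmed for brevity:
```python
# Average the acc_norm scores of the hendrycksTest (MMLU) subtasks in a
# results dict shaped like the JSON above. Only two subtask entries are
# reproduced here; a real dict would hold all 57 of them.
results = {
    "harness|hendrycksTest-clinical_knowledge|5": {"acc_norm": 0.4188679245283019},
    "harness|hendrycksTest-college_biology|5": {"acc_norm": 0.3680555555555556},
    "harness|truthfulqa:mc|0": {"mc2": 0.3397939807337936},  # not an MMLU task
}

mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
avg = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"mean acc_norm over {len(mmlu)} MMLU subtasks: {avg:.4f}")
```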
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
aisyahhrazak/crawl-berita-rtm | 2023-09-13T08:50:43.000Z | [
"region:us"
] | aisyahhrazak | null | null | null | 0 | 0 | Entry not found |
CyberHarem/mimura_kanako_idolmastercinderellagirls | 2023-09-17T17:35:28.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mimura_kanako (THE iDOLM@STER: Cinderella Girls)
This is the dataset of mimura_kanako (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 511 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 511 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 511 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 511 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_dfurman__falcon-40b-openassistant-peft | 2023-09-13T08:58:43.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of dfurman/falcon-40b-openassistant-peft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dfurman/falcon-40b-openassistant-peft](https://huggingface.co/dfurman/falcon-40b-openassistant-peft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dfurman__falcon-40b-openassistant-peft\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T08:57:30.972897](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__falcon-40b-openassistant-peft/blob/main/results_2023-09-13T08-57-30.972897.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5793568664608434,\n\
\ \"acc_stderr\": 0.03399044640081043,\n \"acc_norm\": 0.5832861320419609,\n\
\ \"acc_norm_stderr\": 0.03396571822469063,\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5101760995965263,\n\
\ \"mc2_stderr\": 0.014328841845635439\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522077,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759091\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6616211909978092,\n\
\ \"acc_stderr\": 0.004721911016008658,\n \"acc_norm\": 0.8559051981676957,\n\
\ \"acc_norm_stderr\": 0.0035046810917039027\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.024326310529149138,\n \"\
acc_norm\": 0.335978835978836,\n \"acc_norm_stderr\": 0.024326310529149138\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n\
\ \"acc_stderr\": 0.026522709674667768,\n \"acc_norm\": 0.6806451612903226,\n\
\ \"acc_norm_stderr\": 0.026522709674667768\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091707,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091707\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178256,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178256\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5461538461538461,\n \"acc_stderr\": 0.025242770987126184,\n\
\ \"acc_norm\": 0.5461538461538461,\n \"acc_norm_stderr\": 0.025242770987126184\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.03221943636566196,\n \
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.03221943636566196\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790215,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790215\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854053,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854053\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.031321798030832904,\n \"\
acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.031321798030832904\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6962025316455697,\n \"acc_stderr\": 0.029936696387138615,\n \
\ \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.029936696387138615\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n\
\ \"acc_stderr\": 0.02991858670779882,\n \"acc_norm\": 0.726457399103139,\n\
\ \"acc_norm_stderr\": 0.02991858670779882\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292404,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292404\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578727,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578727\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652247,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652247\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7573435504469987,\n\
\ \"acc_stderr\": 0.015329888940899863,\n \"acc_norm\": 0.7573435504469987,\n\
\ \"acc_norm_stderr\": 0.015329888940899863\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531018,\n\
\ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531018\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.329608938547486,\n\
\ \"acc_stderr\": 0.015721531075183877,\n \"acc_norm\": 0.329608938547486,\n\
\ \"acc_norm_stderr\": 0.015721531075183877\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02699254433929723,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02699254433929723\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.026289734945952926,\n\
\ \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.026289734945952926\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n\
\ \"acc_stderr\": 0.01268590653820624,\n \"acc_norm\": 0.4426336375488918,\n\
\ \"acc_norm_stderr\": 0.01268590653820624\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n\
\ \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5833333333333334,\n \"acc_stderr\": 0.01994491413687358,\n \
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.01994491413687358\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.02992310056368391,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.02992310056368391\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.027686913588013024,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.027686913588013024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5101760995965263,\n\
\ \"mc2_stderr\": 0.014328841845635439\n }\n}\n```"
repo_url: https://huggingface.co/dfurman/falcon-40b-openassistant-peft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|arc:challenge|25_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hellaswag|10_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T08-57-30.972897.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T08-57-30.972897.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T08-57-30.972897.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T08-57-30.972897.parquet'
- config_name: results
data_files:
- split: 2023_09_13T08_57_30.972897
path:
- results_2023-09-13T08-57-30.972897.parquet
- split: latest
path:
- results_2023-09-13T08-57-30.972897.parquet
---
# Dataset Card for Evaluation run of dfurman/falcon-40b-openassistant-peft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dfurman/falcon-40b-openassistant-peft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dfurman/falcon-40b-openassistant-peft](https://huggingface.co/dfurman/falcon-40b-openassistant-peft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dfurman__falcon-40b-openassistant-peft",
"harness_truthfulqa_mc_0",
split="train")
```
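The aggregated metrics of a run live in the "results" configuration, and every configuration also exposes a "latest" split alias. As a minimal sketch built on the config and split names listed in this card's metadata:
```python
from datasets import load_dataset

# Load the aggregated results of the most recent run through the
# "results" config and its "latest" split alias.
results = load_dataset(
    "open-llm-leaderboard/details_dfurman__falcon-40b-openassistant-peft",
    "results",
    split="latest",
)
print(results[0])  # inspect the first row of the aggregated results
```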
## Latest results
These are the [latest results from run 2023-09-13T08:57:30.972897](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__falcon-40b-openassistant-peft/blob/main/results_2023-09-13T08-57-30.972897.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5793568664608434,
"acc_stderr": 0.03399044640081043,
"acc_norm": 0.5832861320419609,
"acc_norm_stderr": 0.03396571822469063,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5101760995965263,
"mc2_stderr": 0.014328841845635439
},
"harness|arc:challenge|25": {
"acc": 0.5887372013651877,
"acc_stderr": 0.014379441068522077,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759091
},
"harness|hellaswag|10": {
"acc": 0.6616211909978092,
"acc_stderr": 0.004721911016008658,
"acc_norm": 0.8559051981676957,
"acc_norm_stderr": 0.0035046810917039027
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.024326310529149138,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.024326310529149138
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667768,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667768
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091707,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091707
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178256,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178256
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5461538461538461,
"acc_stderr": 0.025242770987126184,
"acc_norm": 0.5461538461538461,
"acc_norm_stderr": 0.025242770987126184
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790215,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790215
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854053,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.029936696387138615,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.029936696387138615
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.02991858670779882,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.02991858670779882
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292404,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292404
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578727,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578727
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652247,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652247
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7573435504469987,
"acc_stderr": 0.015329888940899863,
"acc_norm": 0.7573435504469987,
"acc_norm_stderr": 0.015329888940899863
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6502890173410405,
"acc_stderr": 0.025674281456531018,
"acc_norm": 0.6502890173410405,
"acc_norm_stderr": 0.025674281456531018
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.329608938547486,
"acc_stderr": 0.015721531075183877,
"acc_norm": 0.329608938547486,
"acc_norm_stderr": 0.015721531075183877
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02699254433929723,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02699254433929723
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.026289734945952926,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.026289734945952926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236848,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.01268590653820624,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.01268590653820624
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.01994491413687358,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.01994491413687358
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.02992310056368391,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.02992310056368391
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013024,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013024
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5101760995965263,
"mc2_stderr": 0.014328841845635439
}
}
```
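If only these aggregated numbers are needed rather than the per-example details, the "results" configuration declared in the front matter serves them directly; a minimal sketch under the same assumptions:
```python
from datasets import load_dataset

# The "results" config holds one row of aggregated metrics per run;
# split "latest" always points to the newest results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_dfurman__falcon-40b-openassistant-peft",
    "results",
    split="latest",
)
print(results[0])
```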
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
moinulmoin/toy-dataset | 2023-09-13T09:20:13.000Z | [
"region:us"
] | moinulmoin | null | null | null | 0 | 0 | Entry not found |
andythetechnerd03/pubtabnet_without_train | 2023-09-13T09:24:03.000Z | [
"region:us"
] | andythetechnerd03 | null | null | null | 0 | 0 | Entry not found |
aisyahhrazak/crawl-sabahpost | 2023-09-13T09:19:15.000Z | [
"region:us"
] | aisyahhrazak | null | null | null | 0 | 0 | Entry not found |
rraileanu/dreamcraft_planet_mc_quant | 2023-09-13T09:45:11.000Z | [
"region:us"
] | rraileanu | null | null | null | 0 | 0 | hello
|
pwc-india/madras_dataset | 2023-09-13T09:30:29.000Z | [
"region:us"
] | pwc-india | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 22751754.0
num_examples: 10
download_size: 22753302
dataset_size: 22751754.0
---
# Dataset Card for "madras_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rraileanu/dreamcraft_ec_hf | 2023-09-13T09:52:24.000Z | [
"region:us"
] | rraileanu | null | null | null | 0 | 0 | Entry not found |
yvelos/python_code_instructions_18k_alpaca | 2023-09-13T09:31:43.000Z | [
"license:openrail",
"region:us"
] | yvelos | null | null | null | 0 | 0 | ---
license: openrail
---
|
rajivmurali/Members | 2023-09-13T09:37:22.000Z | [
"region:us"
] | rajivmurali | null | null | null | 0 | 0 | Entry not found |
RaysDipesh/obama-voice-samples-283 | 2023-09-13T10:02:55.000Z | [
"region:us"
] | RaysDipesh | null | null | null | 0 | 0 | This dataset is mainly produced for voice cloning tasks. Voice cloning requires an audio dataset, so this repository contains
283 voice samples of Barack Obama in .wav format after processing and updating the metadata.
|
tayamaken/Neznaika | 2023-09-13T09:53:19.000Z | [
"region:us"
] | tayamaken | null | null | null | 0 | 0 | Entry not found |
fiveflow/koquad_v2_polyglot_tokenized | 2023-09-13T09:53:43.000Z | [
"region:us"
] | fiveflow | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_budecosystem__genz-70b | 2023-09-13T09:55:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of budecosystem/genz-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [budecosystem/genz-70b](https://huggingface.co/budecosystem/genz-70b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_budecosystem__genz-70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T09:54:04.852738](https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__genz-70b/blob/main/results_2023-09-13T09-54-04.852738.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7068836056869042,\n\
\ \"acc_stderr\": 0.030821580102790617,\n \"acc_norm\": 0.7107834120730562,\n\
\ \"acc_norm_stderr\": 0.030789480039498475,\n \"mc1\": 0.44063647490820074,\n\
\ \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6265665667010417,\n\
\ \"mc2_stderr\": 0.014770813805241348\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6697952218430034,\n \"acc_stderr\": 0.013743085603760426,\n\
\ \"acc_norm\": 0.7141638225255973,\n \"acc_norm_stderr\": 0.01320319608853737\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6941844254132643,\n\
\ \"acc_stderr\": 0.004598103566842483,\n \"acc_norm\": 0.8799044015136427,\n\
\ \"acc_norm_stderr\": 0.0032440893478294383\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.030643607071677088,\n\
\ \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.030643607071677088\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n\
\ \"acc_stderr\": 0.030085743248565666,\n \"acc_norm\": 0.8472222222222222,\n\
\ \"acc_norm_stderr\": 0.030085743248565666\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6936170212765957,\n \"acc_stderr\": 0.030135906478517563,\n\
\ \"acc_norm\": 0.6936170212765957,\n \"acc_norm_stderr\": 0.030135906478517563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4603174603174603,\n \"acc_stderr\": 0.02567008063690919,\n \"\
acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.02567008063690919\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.021886178567172527,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.021886178567172527\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.03499113137676744,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781678,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781678\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.02325315795194209,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02325315795194209\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078912,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078912\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7205128205128205,\n \"acc_stderr\": 0.022752388839776823,\n\
\ \"acc_norm\": 0.7205128205128205,\n \"acc_norm_stderr\": 0.022752388839776823\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7773109243697479,\n \"acc_stderr\": 0.027025433498882385,\n\
\ \"acc_norm\": 0.7773109243697479,\n \"acc_norm_stderr\": 0.027025433498882385\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8990825688073395,\n \"acc_stderr\": 0.012914673545364415,\n \"\
acc_norm\": 0.8990825688073395,\n \"acc_norm_stderr\": 0.012914673545364415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6111111111111112,\n \"acc_stderr\": 0.03324708911809117,\n \"\
acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.03324708911809117\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9411764705882353,\n \"acc_stderr\": 0.0165144095610258,\n \"acc_norm\"\
: 0.9411764705882353,\n \"acc_norm_stderr\": 0.0165144095610258\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640255,\n \"\
acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640255\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.030884661089515368,\n\
\ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.030884661089515368\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.03172233426002157,\n \"acc_norm\"\
: 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002157\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628123,\n\
\ \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628123\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542126,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542126\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.876117496807152,\n\
\ \"acc_stderr\": 0.011781017100950739,\n \"acc_norm\": 0.876117496807152,\n\
\ \"acc_norm_stderr\": 0.011781017100950739\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n\
\ \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6134078212290502,\n\
\ \"acc_stderr\": 0.016286674879101026,\n \"acc_norm\": 0.6134078212290502,\n\
\ \"acc_norm_stderr\": 0.016286674879101026\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.02392915551735129,\n\
\ \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02392915551735129\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n\
\ \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n\
\ \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.020263764996385717,\n\
\ \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.020263764996385717\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5815602836879432,\n \"acc_stderr\": 0.029427994039420004,\n \
\ \"acc_norm\": 0.5815602836879432,\n \"acc_norm_stderr\": 0.029427994039420004\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5560625814863103,\n\
\ \"acc_stderr\": 0.012689708167787677,\n \"acc_norm\": 0.5560625814863103,\n\
\ \"acc_norm_stderr\": 0.012689708167787677\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n\
\ \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7761437908496732,\n \"acc_stderr\": 0.016863008585416613,\n \
\ \"acc_norm\": 0.7761437908496732,\n \"acc_norm_stderr\": 0.016863008585416613\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.02540930195322568,\n\
\ \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.02540930195322568\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.024112678240900798,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.024112678240900798\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n\
\ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44063647490820074,\n\
\ \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6265665667010417,\n\
\ \"mc2_stderr\": 0.014770813805241348\n }\n}\n```"
repo_url: https://huggingface.co/budecosystem/genz-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|arc:challenge|25_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hellaswag|10_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T09-54-04.852738.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-54-04.852738.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T09-54-04.852738.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T09-54-04.852738.parquet'
- config_name: results
data_files:
- split: 2023_09_13T09_54_04.852738
path:
- results_2023-09-13T09-54-04.852738.parquet
- split: latest
path:
- results_2023-09-13T09-54-04.852738.parquet
---
# Dataset Card for Evaluation run of budecosystem/genz-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/budecosystem/genz-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [budecosystem/genz-70b](https://huggingface.co/budecosystem/genz-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_budecosystem__genz-70b",
"harness_truthfulqa_mc_0",
split="train")
```
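For example, to pull only the aggregated metrics from the most recent run, you can combine the `results` configuration with the `latest` split (a minimal sketch; every configuration listed above exposes the same `latest` split alongside the timestamped one):
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" always points to the newest run.
results = load_dataset("open-llm-leaderboard/details_budecosystem__genz-70b",
	"results",
	split="latest")
```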
## Latest results
These are the [latest results from run 2023-09-13T09:54:04.852738](https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__genz-70b/blob/main/results_2023-09-13T09-54-04.852738.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7068836056869042,
"acc_stderr": 0.030821580102790617,
"acc_norm": 0.7107834120730562,
"acc_norm_stderr": 0.030789480039498475,
"mc1": 0.44063647490820074,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.6265665667010417,
"mc2_stderr": 0.014770813805241348
},
"harness|arc:challenge|25": {
"acc": 0.6697952218430034,
"acc_stderr": 0.013743085603760426,
"acc_norm": 0.7141638225255973,
"acc_norm_stderr": 0.01320319608853737
},
"harness|hellaswag|10": {
"acc": 0.6941844254132643,
"acc_stderr": 0.004598103566842483,
"acc_norm": 0.8799044015136427,
"acc_norm_stderr": 0.0032440893478294383
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.030643607071677088,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.030643607071677088
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565666,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6936170212765957,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.6936170212765957,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.02567008063690919,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.02567008063690919
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172527,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172527
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781678,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781678
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.02325315795194209,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.02325315795194209
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078912,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078912
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7205128205128205,
"acc_stderr": 0.022752388839776823,
"acc_norm": 0.7205128205128205,
"acc_norm_stderr": 0.022752388839776823
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7773109243697479,
"acc_stderr": 0.027025433498882385,
"acc_norm": 0.7773109243697479,
"acc_norm_stderr": 0.027025433498882385
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8990825688073395,
"acc_stderr": 0.012914673545364415,
"acc_norm": 0.8990825688073395,
"acc_norm_stderr": 0.012914673545364415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.03324708911809117,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.03324708911809117
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9411764705882353,
"acc_stderr": 0.0165144095610258,
"acc_norm": 0.9411764705882353,
"acc_norm_stderr": 0.0165144095610258
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640255,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640255
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.030884661089515368,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.030884661089515368
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002157,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002157
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.03083349114628123,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.03083349114628123
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.876117496807152,
"acc_stderr": 0.011781017100950739,
"acc_norm": 0.876117496807152,
"acc_norm_stderr": 0.011781017100950739
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6134078212290502,
"acc_stderr": 0.016286674879101026,
"acc_norm": 0.6134078212290502,
"acc_norm_stderr": 0.016286674879101026
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02392915551735129,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02392915551735129
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.020263764996385717,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.020263764996385717
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5815602836879432,
"acc_stderr": 0.029427994039420004,
"acc_norm": 0.5815602836879432,
"acc_norm_stderr": 0.029427994039420004
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5560625814863103,
"acc_stderr": 0.012689708167787677,
"acc_norm": 0.5560625814863103,
"acc_norm_stderr": 0.012689708167787677
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7536764705882353,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.7536764705882353,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7761437908496732,
"acc_stderr": 0.016863008585416613,
"acc_norm": 0.7761437908496732,
"acc_norm_stderr": 0.016863008585416613
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8040816326530612,
"acc_stderr": 0.02540930195322568,
"acc_norm": 0.8040816326530612,
"acc_norm_stderr": 0.02540930195322568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900798,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900798
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44063647490820074,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.6265665667010417,
"mc2_stderr": 0.014770813805241348
}
}
```
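To aggregate these numbers yourself, for instance the macro-average over the 57 MMLU (`hendrycksTest`) subtasks, here is a minimal sketch assuming the dictionary above has been loaded into a variable named `results`:
```python
# `results` is assumed to be the dictionary shown above.
mmlu = [
    scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest")
]
print(f"MMLU acc_norm, macro-averaged over {len(mmlu)} subtasks: {sum(mmlu) / len(mmlu):.4f}")
```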
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tayamaken/nez | 2023-09-13T09:54:58.000Z | [
"region:us"
] | tayamaken | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_pankajmathur__orca_mini_v3_7b | 2023-09-13T09:58:02.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of pankajmathur/orca_mini_v3_7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pankajmathur/orca_mini_v3_7b](https://huggingface.co/pankajmathur/orca_mini_v3_7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pankajmathur__orca_mini_v3_7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T09:56:47.532864](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__orca_mini_v3_7b/blob/main/results_2023-09-13T09-56-47.532864.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5251974787456066,\n\
\ \"acc_stderr\": 0.03489133346292395,\n \"acc_norm\": 0.5290814034343556,\n\
\ \"acc_norm_stderr\": 0.03487488084405995,\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5051023916730814,\n\
\ \"mc2_stderr\": 0.015679967177000934\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5298634812286689,\n \"acc_stderr\": 0.014585305840007107,\n\
\ \"acc_norm\": 0.5691126279863481,\n \"acc_norm_stderr\": 0.01447113339264247\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6064528978291177,\n\
\ \"acc_stderr\": 0.00487537935207982,\n \"acc_norm\": 0.796355307707628,\n\
\ \"acc_norm_stderr\": 0.004018847286468062\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.45664739884393063,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31216931216931215,\n \"acc_stderr\": 0.02386520683697259,\n \"\
acc_norm\": 0.31216931216931215,\n \"acc_norm_stderr\": 0.02386520683697259\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5548387096774193,\n\
\ \"acc_stderr\": 0.028272410186214906,\n \"acc_norm\": 0.5548387096774193,\n\
\ \"acc_norm_stderr\": 0.028272410186214906\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.035243908445117815,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.035243908445117815\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178274,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178274\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.02534267129380725,\n \
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.02534267129380725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230207,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230207\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.03244980849990029,\n \
\ \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.03244980849990029\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7211009174311926,\n \"acc_stderr\": 0.0192274688764635,\n \"acc_norm\"\
: 0.7211009174311926,\n \"acc_norm_stderr\": 0.0192274688764635\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n\
\ \"acc_stderr\": 0.033812000056435254,\n \"acc_norm\": 0.4351851851851852,\n\
\ \"acc_norm_stderr\": 0.033812000056435254\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373618,\n\
\ \"acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373618\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035286,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035286\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884124,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884124\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906276,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906276\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5950920245398773,\n \"acc_stderr\": 0.03856672163548913,\n\
\ \"acc_norm\": 0.5950920245398773,\n \"acc_norm_stderr\": 0.03856672163548913\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n\
\ \"acc_stderr\": 0.027046857630716688,\n \"acc_norm\": 0.782051282051282,\n\
\ \"acc_norm_stderr\": 0.027046857630716688\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.722860791826309,\n\
\ \"acc_stderr\": 0.016005636294122414,\n \"acc_norm\": 0.722860791826309,\n\
\ \"acc_norm_stderr\": 0.016005636294122414\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.569364161849711,\n \"acc_stderr\": 0.02665880027367238,\n\
\ \"acc_norm\": 0.569364161849711,\n \"acc_norm_stderr\": 0.02665880027367238\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249617,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249617\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.028452639985088006,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.028452639985088006\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\
\ \"acc_stderr\": 0.027882383791325946,\n \"acc_norm\": 0.594855305466238,\n\
\ \"acc_norm_stderr\": 0.027882383791325946\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5802469135802469,\n \"acc_stderr\": 0.027460099557005135,\n\
\ \"acc_norm\": 0.5802469135802469,\n \"acc_norm_stderr\": 0.027460099557005135\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596147,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596147\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39504563233376794,\n\
\ \"acc_stderr\": 0.01248572781325156,\n \"acc_norm\": 0.39504563233376794,\n\
\ \"acc_norm_stderr\": 0.01248572781325156\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.030332578094555033,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.030332578094555033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.020227834851568375,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.020227834851568375\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.04750185058907296,\n\
\ \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.04750185058907296\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6163265306122448,\n\
\ \"acc_stderr\": 0.03113088039623593,\n \"acc_norm\": 0.6163265306122448,\n\
\ \"acc_norm_stderr\": 0.03113088039623593\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.6517412935323383,\n \"acc_stderr\": 0.03368787466115459,\n\
\ \"acc_norm\": 0.6517412935323383,\n \"acc_norm_stderr\": 0.03368787466115459\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.41566265060240964,\n \"acc_stderr\": 0.038367221765980515,\n\
\ \"acc_norm\": 0.41566265060240964,\n \"acc_norm_stderr\": 0.038367221765980515\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7134502923976608,\n\
\ \"acc_stderr\": 0.034678266857038266,\n \"acc_norm\": 0.7134502923976608,\n\
\ \"acc_norm_stderr\": 0.034678266857038266\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.3537331701346389,\n \"mc1_stderr\": 0.016737814358846147,\n\
\ \"mc2\": 0.5051023916730814,\n \"mc2_stderr\": 0.015679967177000934\n\
\ }\n}\n```"
repo_url: https://huggingface.co/pankajmathur/orca_mini_v3_7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|arc:challenge|25_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hellaswag|10_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T09-56-47.532864.parquet'
- config_name: results
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- results_2023-09-13T09-56-47.532864.parquet
- split: latest
path:
- results_2023-09-13T09-56-47.532864.parquet
---
# Dataset Card for Evaluation run of wei123602/llama-13b-FINETUNE3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wei123602/llama-13b-FINETUNE3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wei123602/llama-13b-FINETUNE3](https://huggingface.co/wei123602/llama-13b-FINETUNE3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3",
"harness_truthfulqa_mc_0",
split="train")
```
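The aggregated scores live in the separate "results" configuration declared in this card's YAML header; a minimal sketch along the same lines:
```python
from datasets import load_dataset

# The "latest" split of the "results" config always points to the
# most recent evaluation run of this model.
results = load_dataset("open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3",
	"results",
	split="latest")
```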
## Latest results
These are the [latest results from run 2023-09-13T09:56:47.532864](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3/blob/main/results_2023-09-13T09-56-47.532864.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5251974787456066,
"acc_stderr": 0.03489133346292395,
"acc_norm": 0.5290814034343556,
"acc_norm_stderr": 0.03487488084405995,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5051023916730814,
"mc2_stderr": 0.015679967177000934
},
"harness|arc:challenge|25": {
"acc": 0.5298634812286689,
"acc_stderr": 0.014585305840007107,
"acc_norm": 0.5691126279863481,
"acc_norm_stderr": 0.01447113339264247
},
"harness|hellaswag|10": {
"acc": 0.6064528978291177,
"acc_stderr": 0.00487537935207982,
"acc_norm": 0.796355307707628,
"acc_norm_stderr": 0.004018847286468062
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.041614023984032786,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.041614023984032786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31216931216931215,
"acc_stderr": 0.02386520683697259,
"acc_norm": 0.31216931216931215,
"acc_norm_stderr": 0.02386520683697259
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147125,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147125
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5548387096774193,
"acc_stderr": 0.028272410186214906,
"acc_norm": 0.5548387096774193,
"acc_norm_stderr": 0.028272410186214906
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.03413963805906235,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.03413963805906235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.035243908445117815,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.035243908445117815
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178274,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178274
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.02534267129380725,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.02534267129380725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230207,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230207
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.03244980849990029,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.03244980849990029
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7211009174311926,
"acc_stderr": 0.0192274688764635,
"acc_norm": 0.7211009174311926,
"acc_norm_stderr": 0.0192274688764635
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373618,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373618
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035286,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035286
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884124,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884124
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906276,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906276
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5950920245398773,
"acc_stderr": 0.03856672163548913,
"acc_norm": 0.5950920245398773,
"acc_norm_stderr": 0.03856672163548913
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.782051282051282,
"acc_stderr": 0.027046857630716688,
"acc_norm": 0.782051282051282,
"acc_norm_stderr": 0.027046857630716688
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.722860791826309,
"acc_stderr": 0.016005636294122414,
"acc_norm": 0.722860791826309,
"acc_norm_stderr": 0.016005636294122414
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.569364161849711,
"acc_stderr": 0.02665880027367238,
"acc_norm": 0.569364161849711,
"acc_norm_stderr": 0.02665880027367238
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249617,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249617
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.028452639985088006,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.028452639985088006
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325946,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325946
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5802469135802469,
"acc_stderr": 0.027460099557005135,
"acc_norm": 0.5802469135802469,
"acc_norm_stderr": 0.027460099557005135
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596147,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39504563233376794,
"acc_stderr": 0.01248572781325156,
"acc_norm": 0.39504563233376794,
"acc_norm_stderr": 0.01248572781325156
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.030332578094555033,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.030332578094555033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5,
"acc_stderr": 0.020227834851568375,
"acc_norm": 0.5,
"acc_norm_stderr": 0.020227834851568375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.03113088039623593,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.03113088039623593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.03368787466115459,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.03368787466115459
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5051023916730814,
"mc2_stderr": 0.015679967177000934
}
}
```
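As a minimal sketch of slicing these numbers, assuming the dict above has been parsed into a variable named `results` (the linked JSON file may wrap it in additional top-level keys):
```python
# Collect the MMLU (hendrycksTest) subtask accuracies from the dict above.
mmlu = {task: scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")}
print(len(mmlu), "subtasks, mean acc:", sum(mmlu.values()) / len(mmlu))
```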
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
margenai/StateBankPakistan | 2023-09-13T10:03:31.000Z | [
"license:mit",
"region:us"
] | margenai | null | null | null | 0 | 0 | ---
license: mit
---
|
MikaelEmmanuel/sampleerrors | 2023-09-13T10:11:58.000Z | [
"region:us"
] | MikaelEmmanuel | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k | 2023-09-13T10:13:59.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of conceptofmind/Hermes-LLongMA-2-7b-8k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [conceptofmind/Hermes-LLongMA-2-7b-8k](https://huggingface.co/conceptofmind/Hermes-LLongMA-2-7b-8k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T10:12:42.075501](https://huggingface.co/datasets/open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k/blob/main/results_2023-09-13T10-12-42.075501.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2927085297989137,\n\
\ \"acc_stderr\": 0.03275517148401362,\n \"acc_norm\": 0.29622047886660385,\n\
\ \"acc_norm_stderr\": 0.032746678820457335,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662594,\n \"mc2\": 0.38838708556166845,\n\
\ \"mc2_stderr\": 0.014198737236851828\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.46928327645051193,\n \"acc_stderr\": 0.014583792546304038,\n\
\ \"acc_norm\": 0.4974402730375427,\n \"acc_norm_stderr\": 0.014611199329843777\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5497908783110934,\n\
\ \"acc_stderr\": 0.004964979120927565,\n \"acc_norm\": 0.7288388767177854,\n\
\ \"acc_norm_stderr\": 0.004436505187567003\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.25660377358490566,\n \"acc_stderr\": 0.02688064788905197,\n\
\ \"acc_norm\": 0.25660377358490566,\n \"acc_norm_stderr\": 0.02688064788905197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165085,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165085\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749895,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749895\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.029513196625539355,\n\
\ \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.029513196625539355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948365,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948365\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.0361960452412425,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.0361960452412425\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.02458002892148101,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.02458002892148101\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.1921182266009852,\n \"acc_stderr\": 0.02771931570961478,\n\
\ \"acc_norm\": 0.1921182266009852,\n \"acc_norm_stderr\": 0.02771931570961478\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206824,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206824\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.36363636363636365,\n \"acc_stderr\": 0.03756335775187896,\n\
\ \"acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.03756335775187896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565319,\n\
\ \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565319\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24358974358974358,\n \"acc_stderr\": 0.02176373368417392,\n\
\ \"acc_norm\": 0.24358974358974358,\n \"acc_norm_stderr\": 0.02176373368417392\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23333333333333334,\n \"acc_stderr\": 0.02578787422095932,\n \
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.02578787422095932\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868956,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3155963302752294,\n \"acc_stderr\": 0.019926117513869662,\n \"\
acc_norm\": 0.3155963302752294,\n \"acc_norm_stderr\": 0.019926117513869662\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536023,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536023\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.03308611113236435,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03308611113236435\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3881856540084388,\n \"acc_stderr\": 0.0317229500433233,\n \
\ \"acc_norm\": 0.3881856540084388,\n \"acc_norm_stderr\": 0.0317229500433233\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.273542600896861,\n\
\ \"acc_stderr\": 0.029918586707798824,\n \"acc_norm\": 0.273542600896861,\n\
\ \"acc_norm_stderr\": 0.029918586707798824\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4132231404958678,\n \"acc_stderr\": 0.04495087843548408,\n \"\
acc_norm\": 0.4132231404958678,\n \"acc_norm_stderr\": 0.04495087843548408\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3504273504273504,\n\
\ \"acc_stderr\": 0.03125610824421881,\n \"acc_norm\": 0.3504273504273504,\n\
\ \"acc_norm_stderr\": 0.03125610824421881\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2988505747126437,\n\
\ \"acc_stderr\": 0.01636925681509314,\n \"acc_norm\": 0.2988505747126437,\n\
\ \"acc_norm_stderr\": 0.01636925681509314\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.30057803468208094,\n \"acc_stderr\": 0.024685316867257796,\n\
\ \"acc_norm\": 0.30057803468208094,\n \"acc_norm_stderr\": 0.024685316867257796\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3300653594771242,\n \"acc_stderr\": 0.026925654653615686,\n\
\ \"acc_norm\": 0.3300653594771242,\n \"acc_norm_stderr\": 0.026925654653615686\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.025630824975621344,\n\
\ \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.025630824975621344\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.28368794326241137,\n \"acc_stderr\": 0.02689170942834396,\n \
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.02689170942834396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28748370273794005,\n\
\ \"acc_stderr\": 0.011559337355708502,\n \"acc_norm\": 0.28748370273794005,\n\
\ \"acc_norm_stderr\": 0.011559337355708502\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.28104575163398693,\n \"acc_stderr\": 0.018185218954318075,\n \
\ \"acc_norm\": 0.28104575163398693,\n \"acc_norm_stderr\": 0.018185218954318075\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4163265306122449,\n \"acc_stderr\": 0.03155782816556165,\n\
\ \"acc_norm\": 0.4163265306122449,\n \"acc_norm_stderr\": 0.03155782816556165\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.34328358208955223,\n\
\ \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.34328358208955223,\n\
\ \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.034605799075530276,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.034605799075530276\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.39766081871345027,\n \"acc_stderr\": 0.0375363895576169,\n\
\ \"acc_norm\": 0.39766081871345027,\n \"acc_norm_stderr\": 0.0375363895576169\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662594,\n \"mc2\": 0.38838708556166845,\n\
\ \"mc2_stderr\": 0.014198737236851828\n }\n}\n```"
repo_url: https://huggingface.co/conceptofmind/Hermes-LLongMA-2-7b-8k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|arc:challenge|25_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hellaswag|10_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T10-12-42.075501.parquet'
- config_name: results
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- results_2023-09-13T10-12-42.075501.parquet
- split: latest
path:
- results_2023-09-13T10-12-42.075501.parquet
---
# Dataset Card for Evaluation run of conceptofmind/Hermes-LLongMA-2-7b-8k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/conceptofmind/Hermes-LLongMA-2-7b-8k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [conceptofmind/Hermes-LLongMA-2-7b-8k](https://huggingface.co/conceptofmind/Hermes-LLongMA-2-7b-8k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k",
"harness_truthfulqa_mc_0",
split="train")
```
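If you want to see which configurations are available before loading one, a minimal sketch (assuming only the `datasets` library, which the snippet above already uses) is:
```python
from datasets import get_dataset_config_names

# List the 61 per-task configurations plus the aggregated "results" config
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k"
)
print(configs)
```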
## Latest results
These are the [latest results from run 2023-09-13T10:12:42.075501](https://huggingface.co/datasets/open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k/blob/main/results_2023-09-13T10-12-42.075501.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2927085297989137,
"acc_stderr": 0.03275517148401362,
"acc_norm": 0.29622047886660385,
"acc_norm_stderr": 0.032746678820457335,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662594,
"mc2": 0.38838708556166845,
"mc2_stderr": 0.014198737236851828
},
"harness|arc:challenge|25": {
"acc": 0.46928327645051193,
"acc_stderr": 0.014583792546304038,
"acc_norm": 0.4974402730375427,
"acc_norm_stderr": 0.014611199329843777
},
"harness|hellaswag|10": {
"acc": 0.5497908783110934,
"acc_stderr": 0.004964979120927565,
"acc_norm": 0.7288388767177854,
"acc_norm_stderr": 0.004436505187567003
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.035914440841969694,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.035914440841969694
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.25660377358490566,
"acc_stderr": 0.02688064788905197,
"acc_norm": 0.25660377358490566,
"acc_norm_stderr": 0.02688064788905197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165085,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165085
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749895,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749895
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.029513196625539355,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.029513196625539355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948365,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948365
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.0361960452412425,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.0361960452412425
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.02458002892148101,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.02458002892148101
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1921182266009852,
"acc_stderr": 0.02771931570961478,
"acc_norm": 0.1921182266009852,
"acc_norm_stderr": 0.02771931570961478
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206824,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206824
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.03756335775187896,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.03756335775187896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2538860103626943,
"acc_stderr": 0.03141024780565319,
"acc_norm": 0.2538860103626943,
"acc_norm_stderr": 0.03141024780565319
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.02176373368417392,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.02176373368417392
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.02578787422095932,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.02578787422095932
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868956,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3155963302752294,
"acc_stderr": 0.019926117513869662,
"acc_norm": 0.3155963302752294,
"acc_norm_stderr": 0.019926117513869662
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.031141447823536023,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.031141447823536023
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03308611113236435,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03308611113236435
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3881856540084388,
"acc_stderr": 0.0317229500433233,
"acc_norm": 0.3881856540084388,
"acc_norm_stderr": 0.0317229500433233
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.273542600896861,
"acc_stderr": 0.029918586707798824,
"acc_norm": 0.273542600896861,
"acc_norm_stderr": 0.029918586707798824
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4132231404958678,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.4132231404958678,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3504273504273504,
"acc_stderr": 0.03125610824421881,
"acc_norm": 0.3504273504273504,
"acc_norm_stderr": 0.03125610824421881
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2988505747126437,
"acc_stderr": 0.01636925681509314,
"acc_norm": 0.2988505747126437,
"acc_norm_stderr": 0.01636925681509314
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.024685316867257796,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.024685316867257796
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3300653594771242,
"acc_stderr": 0.026925654653615686,
"acc_norm": 0.3300653594771242,
"acc_norm_stderr": 0.026925654653615686
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.025630824975621344,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.025630824975621344
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.02689170942834396,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.02689170942834396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.28748370273794005,
"acc_stderr": 0.011559337355708502,
"acc_norm": 0.28748370273794005,
"acc_norm_stderr": 0.011559337355708502
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28104575163398693,
"acc_stderr": 0.018185218954318075,
"acc_norm": 0.28104575163398693,
"acc_norm_stderr": 0.018185218954318075
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4163265306122449,
"acc_stderr": 0.03155782816556165,
"acc_norm": 0.4163265306122449,
"acc_norm_stderr": 0.03155782816556165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.34328358208955223,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.34328358208955223,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.034605799075530276,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.034605799075530276
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.39766081871345027,
"acc_stderr": 0.0375363895576169,
"acc_norm": 0.39766081871345027,
"acc_norm_stderr": 0.0375363895576169
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662594,
"mc2": 0.38838708556166845,
"mc2_stderr": 0.014198737236851828
}
}
```
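To work with these aggregated numbers programmatically instead of copying them from the card, a minimal sketch (again assuming the `datasets` library; the repo id matches the loading example above) is:
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics; the "latest" split
# always points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k",
    "results",
    split="latest",
)
print(results[0])  # first row of the aggregated results table
```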
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dermisolveskintagremoverbuy/dermisolveskintagremoverbuy | 2023-09-13T10:16:24.000Z | [
"region:us"
] | dermisolveskintagremoverbuy | null | null | null | 0 | 0 | Entry not found |
dermisolveskintagremoverbuy/dermisolveskintagremover | 2023-09-13T10:22:26.000Z | [
"region:us"
] | dermisolveskintagremoverbuy | null | null | null | 0 | 0 | Entry not found |
YaNWoni/KimGaram | 2023-09-13T10:45:41.000Z | [
"region:us"
] | YaNWoni | null | null | null | 0 | 0 | Entry not found |
CyberHarem/kohinata_miho_idolmastercinderellagirls | 2023-09-17T17:35:30.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kohinata_miho (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kohinata_miho (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 487 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 487 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 487 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 487 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
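The archives above can also be fetched programmatically; a minimal sketch (assuming the `huggingface_hub` library, with the repo id taken from this card's location) is:
```python
from huggingface_hub import hf_hub_download

# Download one of the packaged variants from this dataset repository
path = hf_hub_download(
    repo_id="CyberHarem/kohinata_miho_idolmastercinderellagirls",
    filename="dataset-640x880.zip",
    repo_type="dataset",
)
print(path)  # local cache path of the downloaded archive
```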
|
Goorm-AI-04/DroneRF | 2023-09-13T13:03:59.000Z | [
"license:unknown",
"region:us"
] | Goorm-AI-04 | null | null | null | 0 | 0 | ---
license: unknown
---
|
justinwilloughby/mimarchive-all-MiniLM-L6-v2 | 2023-09-13T17:11:10.000Z | [
"license:mit",
"region:us"
] | justinwilloughby | null | null | null | 0 | 0 | ---
license: mit
---
|
shijli/wmt16-roen | 2023-09-14T07:14:22.000Z | [
"region:us"
] | shijli | null | null | null | 0 | 0 | # WMT 2016 Romanian-English Translation Dataset
The original dataset can be downloaded from [here](https://github.com/nyu-dl/dl4mt-nonauto).
You can create this dataset by simply running:
```commandline
git clone https://huggingface.co/datasets/shijli/wmt16-roen
cd wmt16-roen/data
bash prepare-wmt16.sh
```
`binarized.dist.ro-en.zip` and `binarized.dist.en-ro.zip` are distilled datasets generated by a transformer base model.
Each can be built by running:
```commandline
bash prepare-wmt16-distill.sh /path/to/fairseq/model source-lang target-lang
```
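As a concrete illustration (the checkpoint path here is hypothetical), distilling in the Romanian-to-English direction would look like:
```commandline
bash prepare-wmt16-distill.sh checkpoints/transformer_base.pt ro en
```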
To build this dataset, you need to create `binarized.zip` first. Note that the distilled dataset only uses
model-generated
target sentences, which means that different translation directions result in different datasets. Therefore, you need to
specify `source-lang` and `target-lang` explicitly. Also, you need to replace `/path/to/fairseq/model` with the path of
your pretrained model. |
ppsanjay/dwd | 2023-09-16T23:25:25.000Z | [
"region:us"
] | ppsanjay | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_openBuddy__openbuddy-llama2-34b-v11.1-bf16 | 2023-09-13T12:15:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of oPenBuddy/openbuddy-llama2-34b-v11.1-bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [oPenBuddy/openbuddy-llama2-34b-v11.1-bf16](https://huggingface.co/oPenBuddy/openbuddy-llama2-34b-v11.1-bf16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_oPenBuddy__openbuddy-llama2-34b-v11.1-bf16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T12:14:53.531149](https://huggingface.co/datasets/open-llm-leaderboard/details_oPenBuddy__openbuddy-llama2-34b-v11.1-bf16/blob/main/results_2023-09-13T12-14-53.531149.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5550459864088625,\n\
\ \"acc_stderr\": 0.034737804810213574,\n \"acc_norm\": 0.5587640432720675,\n\
\ \"acc_norm_stderr\": 0.03473060679811294,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.016850961061720113,\n \"mc2\": 0.5300576050535195,\n\
\ \"mc2_stderr\": 0.015528670586705939\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4641638225255973,\n \"acc_stderr\": 0.014573813664735716,\n\
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.014611390804670088\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5283808006373233,\n\
\ \"acc_stderr\": 0.004981736689518747,\n \"acc_norm\": 0.7119099780920135,\n\
\ \"acc_norm_stderr\": 0.0045194768356467754\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5132075471698113,\n \"acc_stderr\": 0.030762134874500476,\n\
\ \"acc_norm\": 0.5132075471698113,\n \"acc_norm_stderr\": 0.030762134874500476\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.03807301726504511,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.03807301726504511\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155236,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155236\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n\
\ \"acc_stderr\": 0.027528904299845704,\n \"acc_norm\": 0.6258064516129033,\n\
\ \"acc_norm_stderr\": 0.027528904299845704\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.03459058815883231,\n\
\ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.03459058815883231\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5538461538461539,\n \"acc_stderr\": 0.02520357177302833,\n \
\ \"acc_norm\": 0.5538461538461539,\n \"acc_norm_stderr\": 0.02520357177302833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.0322529423239964,\n \
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7412844036697248,\n \"acc_stderr\": 0.018776052319619627,\n \"\
acc_norm\": 0.7412844036697248,\n \"acc_norm_stderr\": 0.018776052319619627\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7009803921568627,\n \"acc_stderr\": 0.032133257173736156,\n \"\
acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.032133257173736156\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.046561471100123514,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.046561471100123514\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543674,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543674\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7100893997445722,\n\
\ \"acc_stderr\": 0.01622501794477097,\n \"acc_norm\": 0.7100893997445722,\n\
\ \"acc_norm_stderr\": 0.01622501794477097\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124655,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124655\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2860335195530726,\n\
\ \"acc_stderr\": 0.01511397212906214,\n \"acc_norm\": 0.2860335195530726,\n\
\ \"acc_norm_stderr\": 0.01511397212906214\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.02827549015679146,\n\
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.02827549015679146\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.027604689028581975,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.027604689028581975\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5679012345679012,\n \"acc_stderr\": 0.027563010971606676,\n\
\ \"acc_norm\": 0.5679012345679012,\n \"acc_norm_stderr\": 0.027563010971606676\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.38652482269503546,\n \"acc_stderr\": 0.029049190342543472,\n \
\ \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.029049190342543472\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39960886571056065,\n\
\ \"acc_stderr\": 0.012510181636960672,\n \"acc_norm\": 0.39960886571056065,\n\
\ \"acc_norm_stderr\": 0.012510181636960672\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45955882352941174,\n \"acc_stderr\": 0.030273325077345755,\n\
\ \"acc_norm\": 0.45955882352941174,\n \"acc_norm_stderr\": 0.030273325077345755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5081699346405228,\n \"acc_stderr\": 0.02022513434305727,\n \
\ \"acc_norm\": 0.5081699346405228,\n \"acc_norm_stderr\": 0.02022513434305727\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209205,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.016850961061720113,\n \"mc2\": 0.5300576050535195,\n\
\ \"mc2_stderr\": 0.015528670586705939\n }\n}\n```"
repo_url: https://huggingface.co/oPenBuddy/openbuddy-llama2-34b-v11.1-bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|arc:challenge|25_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|arc:challenge|25_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hellaswag|10_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hellaswag|10_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T12-14-53.531149.parquet'
- config_name: results
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- results_2023-09-13T11-53-35.640501.parquet
- split: 2023_09_13T12_14_53.531149
path:
- results_2023-09-13T12-14-53.531149.parquet
- split: latest
path:
- results_2023-09-13T12-14-53.531149.parquet
---
# Dataset Card for Evaluation run of oPenBuddy/openbuddy-llama2-34b-v11.1-bf16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/oPenBuddy/openbuddy-llama2-34b-v11.1-bf16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [oPenBuddy/openbuddy-llama2-34b-v11.1-bf16](https://huggingface.co/oPenBuddy/openbuddy-llama2-34b-v11.1-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_oPenBuddy__openbuddy-llama2-34b-v11.1-bf16",
"harness_truthfulqa_mc_0",
split="train")
```
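Each configuration also exposes a `latest` split (see the YAML header above), so you can pin to the most recent run without knowing its timestamp. A minimal sketch, assuming the same task configuration as above:
```python
from datasets import load_dataset

# Selecting the "latest" split always returns the most recent evaluation
# for this task, regardless of how many timestamped runs exist.
data = load_dataset("open-llm-leaderboard/details_oPenBuddy__openbuddy-llama2-34b-v11.1-bf16",
	"harness_truthfulqa_mc_0",
	split="latest")
```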
## Latest results
These are the [latest results from run 2023-09-13T12:14:53.531149](https://huggingface.co/datasets/open-llm-leaderboard/details_oPenBuddy__openbuddy-llama2-34b-v11.1-bf16/blob/main/results_2023-09-13T12-14-53.531149.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5550459864088625,
"acc_stderr": 0.034737804810213574,
"acc_norm": 0.5587640432720675,
"acc_norm_stderr": 0.03473060679811294,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.016850961061720113,
"mc2": 0.5300576050535195,
"mc2_stderr": 0.015528670586705939
},
"harness|arc:challenge|25": {
"acc": 0.4641638225255973,
"acc_stderr": 0.014573813664735716,
"acc_norm": 0.5,
"acc_norm_stderr": 0.014611390804670088
},
"harness|hellaswag|10": {
"acc": 0.5283808006373233,
"acc_stderr": 0.004981736689518747,
"acc_norm": 0.7119099780920135,
"acc_norm_stderr": 0.0045194768356467754
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5132075471698113,
"acc_stderr": 0.030762134874500476,
"acc_norm": 0.5132075471698113,
"acc_norm_stderr": 0.030762134874500476
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.03807301726504511,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.03807301726504511
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939391,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939391
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155236,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155236
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.027528904299845704,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.027528904299845704
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.03459058815883231,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.03459058815883231
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5538461538461539,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.5538461538461539,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7412844036697248,
"acc_stderr": 0.018776052319619627,
"acc_norm": 0.7412844036697248,
"acc_norm_stderr": 0.018776052319619627
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.032133257173736156,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.032133257173736156
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.046561471100123514,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.046561471100123514
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543674,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543674
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7100893997445722,
"acc_stderr": 0.01622501794477097,
"acc_norm": 0.7100893997445722,
"acc_norm_stderr": 0.01622501794477097
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124655,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2860335195530726,
"acc_stderr": 0.01511397212906214,
"acc_norm": 0.2860335195530726,
"acc_norm_stderr": 0.01511397212906214
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.02827549015679146,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.02827549015679146
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028581975,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028581975
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5679012345679012,
"acc_stderr": 0.027563010971606676,
"acc_norm": 0.5679012345679012,
"acc_norm_stderr": 0.027563010971606676
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.029049190342543472,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.029049190342543472
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39960886571056065,
"acc_stderr": 0.012510181636960672,
"acc_norm": 0.39960886571056065,
"acc_norm_stderr": 0.012510181636960672
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45955882352941174,
"acc_stderr": 0.030273325077345755,
"acc_norm": 0.45955882352941174,
"acc_norm_stderr": 0.030273325077345755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5081699346405228,
"acc_stderr": 0.02022513434305727,
"acc_norm": 0.5081699346405228,
"acc_norm_stderr": 0.02022513434305727
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209205,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.016850961061720113,
"mc2": 0.5300576050535195,
"mc2_stderr": 0.015528670586705939
}
}
```
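To work with the aggregated metrics programmatically rather than reading the JSON above, you can load the `results` configuration in the same way. A minimal sketch (the exact column layout of the results parquet may vary between harness versions, so inspect the fields before indexing):
```python
from datasets import load_dataset

# The "results" configuration stores one aggregated results file per run;
# its "latest" split points at the most recent one.
results = load_dataset("open-llm-leaderboard/details_oPenBuddy__openbuddy-llama2-34b-v11.1-bf16",
	"results",
	split="latest")
print(results.column_names)  # inspect the available fields first
```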
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus-8bit-att | 2023-09-13T11:57:04.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NekoPunchBBB/Llama-2-13b-hf_Open-Platypus-8bit-att
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NekoPunchBBB/Llama-2-13b-hf_Open-Platypus-8bit-att](https://huggingface.co/NekoPunchBBB/Llama-2-13b-hf_Open-Platypus-8bit-att)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus-8bit-att\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T11:55:45.595648](https://huggingface.co/datasets/open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus-8bit-att/blob/main/results_2023-09-13T11-55-45.595648.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5469117280199212,\n\
\ \"acc_stderr\": 0.03446226444089006,\n \"acc_norm\": 0.5507719050490604,\n\
\ \"acc_norm_stderr\": 0.0344434012007288,\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.4220824382892948,\n\
\ \"mc2_stderr\": 0.014439584076534399\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5554607508532423,\n \"acc_stderr\": 0.014521226405627079,\n\
\ \"acc_norm\": 0.5750853242320819,\n \"acc_norm_stderr\": 0.014445698968520769\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6132244572794264,\n\
\ \"acc_stderr\": 0.004860162076330988,\n \"acc_norm\": 0.8213503286197968,\n\
\ \"acc_norm_stderr\": 0.003822758343922915\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.04065771002562605,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.04065771002562605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.030197611600197946,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.030197611600197946\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.038118909889404126,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.038118909889404126\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502707,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.0240268463928735,\n \"acc_norm\"\
: 0.3201058201058201,\n \"acc_norm_stderr\": 0.0240268463928735\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04006168083848878,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04006168083848878\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\
\ \"acc_stderr\": 0.026593084516572267,\n \"acc_norm\": 0.6774193548387096,\n\
\ \"acc_norm_stderr\": 0.026593084516572267\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.034711928605184676,\n\
\ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.034711928605184676\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6919191919191919,\n \"acc_stderr\": 0.03289477330098616,\n \"\
acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.03289477330098616\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860677,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860677\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145665,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145665\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n \
\ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7522935779816514,\n \"acc_stderr\": 0.018508143602547822,\n \"\
acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.018508143602547822\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7522349936143039,\n\
\ \"acc_stderr\": 0.01543808308056897,\n \"acc_norm\": 0.7522349936143039,\n\
\ \"acc_norm_stderr\": 0.01543808308056897\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016124,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3329608938547486,\n\
\ \"acc_stderr\": 0.015761716178397556,\n \"acc_norm\": 0.3329608938547486,\n\
\ \"acc_norm_stderr\": 0.015761716178397556\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200865,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200865\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.02923346574557308,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.02923346574557308\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41851368970013036,\n\
\ \"acc_stderr\": 0.012599505608336461,\n \"acc_norm\": 0.41851368970013036,\n\
\ \"acc_norm_stderr\": 0.012599505608336461\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.030343264224213528,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.030343264224213528\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5522875816993464,\n \"acc_stderr\": 0.020116925347422425,\n \
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.020116925347422425\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5795918367346938,\n \"acc_stderr\": 0.03160106993449601,\n\
\ \"acc_norm\": 0.5795918367346938,\n \"acc_norm_stderr\": 0.03160106993449601\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036847,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036847\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209205,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.4220824382892948,\n\
\ \"mc2_stderr\": 0.014439584076534399\n }\n}\n```"
repo_url: https://huggingface.co/NekoPunchBBB/Llama-2-13b-hf_Open-Platypus-8bit-att
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|arc:challenge|25_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hellaswag|10_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T11-55-45.595648.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T11-55-45.595648.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T11-55-45.595648.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T11-55-45.595648.parquet'
- config_name: results
data_files:
- split: 2023_09_13T11_55_45.595648
path:
- results_2023-09-13T11-55-45.595648.parquet
- split: latest
path:
- results_2023-09-13T11-55-45.595648.parquet
---
# Dataset Card for Evaluation run of NekoPunchBBB/Llama-2-13b-hf_Open-Platypus-8bit-att
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NekoPunchBBB/Llama-2-13b-hf_Open-Platypus-8bit-att
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NekoPunchBBB/Llama-2-13b-hf_Open-Platypus-8bit-att](https://huggingface.co/NekoPunchBBB/Llama-2-13b-hf_Open-Platypus-8bit-att) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus-8bit-att",
"harness_truthfulqa_mc_0",
split="train")
```
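Note that the splits actually declared for each configuration in this card are the run timestamp and `latest`. As a minimal sketch (assuming only the `datasets` library and the configurations listed in the YAML header above), the aggregated `results` configuration can be loaded the same way:
```python
from datasets import load_dataset

# "results" is the aggregated configuration declared in this card;
# the "latest" split always resolves to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus-8bit-att",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated scores for this run
```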
## Latest results
These are the [latest results from run 2023-09-13T11:55:45.595648](https://huggingface.co/datasets/open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus-8bit-att/blob/main/results_2023-09-13T11-55-45.595648.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5469117280199212,
"acc_stderr": 0.03446226444089006,
"acc_norm": 0.5507719050490604,
"acc_norm_stderr": 0.0344434012007288,
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.4220824382892948,
"mc2_stderr": 0.014439584076534399
},
"harness|arc:challenge|25": {
"acc": 0.5554607508532423,
"acc_stderr": 0.014521226405627079,
"acc_norm": 0.5750853242320819,
"acc_norm_stderr": 0.014445698968520769
},
"harness|hellaswag|10": {
"acc": 0.6132244572794264,
"acc_stderr": 0.004860162076330988,
"acc_norm": 0.8213503286197968,
"acc_norm_stderr": 0.003822758343922915
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.030197611600197946,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.030197611600197946
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.038118909889404126,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.038118909889404126
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502707,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.0240268463928735,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.0240268463928735
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848878,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848878
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572267,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572267
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.034711928605184676,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.034711928605184676
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512567,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512567
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.03289477330098616,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.03289477330098616
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860677,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860677
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5102564102564102,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.5102564102564102,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145665,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145665
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.018508143602547822,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.018508143602547822
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7522349936143039,
"acc_stderr": 0.01543808308056897,
"acc_norm": 0.7522349936143039,
"acc_norm_stderr": 0.01543808308056897
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016124,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3329608938547486,
"acc_stderr": 0.015761716178397556,
"acc_norm": 0.3329608938547486,
"acc_norm_stderr": 0.015761716178397556
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200865,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200865
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.02923346574557308,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.02923346574557308
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41851368970013036,
"acc_stderr": 0.012599505608336461,
"acc_norm": 0.41851368970013036,
"acc_norm_stderr": 0.012599505608336461
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.030343264224213528,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.030343264224213528
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.020116925347422425,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.020116925347422425
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5795918367346938,
"acc_stderr": 0.03160106993449601,
"acc_norm": 0.5795918367346938,
"acc_norm_stderr": 0.03160106993449601
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209205,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.4220824382892948,
"mc2_stderr": 0.014439584076534399
}
}
```
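To drill down past these aggregates, each per-task configuration listed in the YAML header exposes the individual predictions. A small exploratory sketch follows (the config name uses the `harness_hendrycksTest_<subject>_5` pattern from the listing above; the exact column names depend on the harness version, so the printout should be treated as exploratory):
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus-8bit-att"

# Per-question details for one MMLU subject, taken from the latest run.
details = load_dataset(repo, "harness_hendrycksTest_abstract_algebra_5", split="latest")

print(len(details), "examples")
print(details.column_names)  # inspect which fields this harness version recorded
```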
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jaluoma/test_dataset | 2023-09-13T12:03:57.000Z | [
"region:us"
] | jaluoma | null | null | null | 0 | 0 | Entry not found |
newbia/cryptoinfo | 2023-09-13T12:03:12.000Z | [
"license:lgpl",
"region:us"
] | newbia | null | null | null | 0 | 0 | ---
license: lgpl
---
|
open-llm-leaderboard/details_speechlessai__speechless-codellama-dolphin-orca-platypus-13b | 2023-09-13T12:06:40.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of speechlessai/speechless-codellama-dolphin-orca-platypus-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [speechlessai/speechless-codellama-dolphin-orca-platypus-13b](https://huggingface.co/speechlessai/speechless-codellama-dolphin-orca-platypus-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_speechlessai__speechless-codellama-dolphin-orca-platypus-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T12:05:20.709991](https://huggingface.co/datasets/open-llm-leaderboard/details_speechlessai__speechless-codellama-dolphin-orca-platypus-13b/blob/main/results_2023-09-13T12-05-20.709991.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4588968585428968,\n\
\ \"acc_stderr\": 0.03519197614683895,\n \"acc_norm\": 0.4625213562518136,\n\
\ \"acc_norm_stderr\": 0.03518869986220989,\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.44671878200778964,\n\
\ \"mc2_stderr\": 0.014868125906056512\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.42150170648464164,\n \"acc_stderr\": 0.01443019706932602,\n\
\ \"acc_norm\": 0.45819112627986347,\n \"acc_norm_stderr\": 0.014560220308714697\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.49990041824337783,\n\
\ \"acc_stderr\": 0.0049897813124832125,\n \"acc_norm\": 0.6770563632742481,\n\
\ \"acc_norm_stderr\": 0.004666457279979415\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.36981132075471695,\n \"acc_stderr\": 0.029711421880107922,\n\
\ \"acc_norm\": 0.36981132075471695,\n \"acc_norm_stderr\": 0.029711421880107922\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4027777777777778,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.4027777777777778,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n\
\ \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.3872832369942196,\n\
\ \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993177,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29894179894179895,\n \"acc_stderr\": 0.02357760479165581,\n \"\
acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.02357760479165581\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.45806451612903226,\n\
\ \"acc_stderr\": 0.02834378725054063,\n \"acc_norm\": 0.45806451612903226,\n\
\ \"acc_norm_stderr\": 0.02834378725054063\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358608,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358608\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.03781887353205982,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.03781887353205982\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5454545454545454,\n \"acc_stderr\": 0.03547601494006937,\n \"\
acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.03547601494006937\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5492227979274611,\n \"acc_stderr\": 0.03590910952235524,\n\
\ \"acc_norm\": 0.5492227979274611,\n \"acc_norm_stderr\": 0.03590910952235524\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3769230769230769,\n \"acc_stderr\": 0.024570975364225995,\n\
\ \"acc_norm\": 0.3769230769230769,\n \"acc_norm_stderr\": 0.024570975364225995\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507384,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507384\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.46218487394957986,\n \"acc_stderr\": 0.0323854694875898,\n \
\ \"acc_norm\": 0.46218487394957986,\n \"acc_norm_stderr\": 0.0323854694875898\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5651376146788991,\n \"acc_stderr\": 0.02125463146560929,\n \"\
acc_norm\": 0.5651376146788991,\n \"acc_norm_stderr\": 0.02125463146560929\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647207,\n \"\
acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647207\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6029411764705882,\n \"acc_stderr\": 0.034341311647191286,\n \"\
acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.034341311647191286\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7046413502109705,\n \"acc_stderr\": 0.029696338713422882,\n \
\ \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.029696338713422882\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5022421524663677,\n\
\ \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.5022421524663677,\n\
\ \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.45038167938931295,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.45038167938931295,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.043207678075366705,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.043207678075366705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.039015918258361836,\n\
\ \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.039015918258361836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7094017094017094,\n\
\ \"acc_stderr\": 0.029745048572674064,\n \"acc_norm\": 0.7094017094017094,\n\
\ \"acc_norm_stderr\": 0.029745048572674064\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5478927203065134,\n\
\ \"acc_stderr\": 0.017797751493865633,\n \"acc_norm\": 0.5478927203065134,\n\
\ \"acc_norm_stderr\": 0.017797751493865633\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4682080924855491,\n \"acc_stderr\": 0.026864624366756643,\n\
\ \"acc_norm\": 0.4682080924855491,\n \"acc_norm_stderr\": 0.026864624366756643\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n\
\ \"acc_stderr\": 0.01594930879023364,\n \"acc_norm\": 0.34972067039106147,\n\
\ \"acc_norm_stderr\": 0.01594930879023364\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4150326797385621,\n \"acc_stderr\": 0.0282135041778241,\n\
\ \"acc_norm\": 0.4150326797385621,\n \"acc_norm_stderr\": 0.0282135041778241\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4983922829581994,\n\
\ \"acc_stderr\": 0.02839794490780661,\n \"acc_norm\": 0.4983922829581994,\n\
\ \"acc_norm_stderr\": 0.02839794490780661\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.47530864197530864,\n \"acc_stderr\": 0.02778680093142745,\n\
\ \"acc_norm\": 0.47530864197530864,\n \"acc_norm_stderr\": 0.02778680093142745\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3475177304964539,\n \"acc_stderr\": 0.02840662780959095,\n \
\ \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.02840662780959095\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36897001303780963,\n\
\ \"acc_stderr\": 0.012323936650174857,\n \"acc_norm\": 0.36897001303780963,\n\
\ \"acc_norm_stderr\": 0.012323936650174857\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.028418208619406794,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.028418208619406794\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4166666666666667,\n \"acc_stderr\": 0.01994491413687358,\n \
\ \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.01994491413687358\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5918367346938775,\n \"acc_stderr\": 0.03146465712827423,\n\
\ \"acc_norm\": 0.5918367346938775,\n \"acc_norm_stderr\": 0.03146465712827423\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6268656716417911,\n\
\ \"acc_stderr\": 0.034198326081760065,\n \"acc_norm\": 0.6268656716417911,\n\
\ \"acc_norm_stderr\": 0.034198326081760065\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5730994152046783,\n \"acc_stderr\": 0.03793620616529917,\n\
\ \"acc_norm\": 0.5730994152046783,\n \"acc_norm_stderr\": 0.03793620616529917\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.44671878200778964,\n\
\ \"mc2_stderr\": 0.014868125906056512\n }\n}\n```"
repo_url: https://huggingface.co/speechlessai/speechless-codellama-dolphin-orca-platypus-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|arc:challenge|25_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hellaswag|10_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T12-05-20.709991.parquet'
- config_name: results
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- results_2023-09-13T12-05-20.709991.parquet
- split: latest
path:
- results_2023-09-13T12-05-20.709991.parquet
---
# Dataset Card for Evaluation run of speechlessai/speechless-codellama-dolphin-orca-platypus-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/speechlessai/speechless-codellama-dolphin-orca-platypus-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [speechlessai/speechless-codellama-dolphin-orca-platypus-13b](https://huggingface.co/speechlessai/speechless-codellama-dolphin-orca-platypus-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_speechlessai__speechless-codellama-dolphin-orca-platypus-13b",
"harness_truthfulqa_mc_0",
split="train")
```
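Since the run exposes one configuration per task, it can help to list the available config names before picking one. A minimal sketch using the `datasets` library's `get_dataset_config_names` helper:
```python
from datasets import get_dataset_config_names

# List the 61 per-task configurations declared for this evaluation run.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_speechlessai__speechless-codellama-dolphin-orca-platypus-13b"
)
print(configs[:5])
```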
## Latest results
These are the [latest results from run 2023-09-13T12:05:20.709991](https://huggingface.co/datasets/open-llm-leaderboard/details_speechlessai__speechless-codellama-dolphin-orca-platypus-13b/blob/main/results_2023-09-13T12-05-20.709991.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4588968585428968,
"acc_stderr": 0.03519197614683895,
"acc_norm": 0.4625213562518136,
"acc_norm_stderr": 0.03518869986220989,
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777308,
"mc2": 0.44671878200778964,
"mc2_stderr": 0.014868125906056512
},
"harness|arc:challenge|25": {
"acc": 0.42150170648464164,
"acc_stderr": 0.01443019706932602,
"acc_norm": 0.45819112627986347,
"acc_norm_stderr": 0.014560220308714697
},
"harness|hellaswag|10": {
"acc": 0.49990041824337783,
"acc_stderr": 0.0049897813124832125,
"acc_norm": 0.6770563632742481,
"acc_norm_stderr": 0.004666457279979415
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.36981132075471695,
"acc_stderr": 0.029711421880107922,
"acc_norm": 0.36981132075471695,
"acc_norm_stderr": 0.029711421880107922
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993177,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.02357760479165581,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.02357760479165581
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.45806451612903226,
"acc_stderr": 0.02834378725054063,
"acc_norm": 0.45806451612903226,
"acc_norm_stderr": 0.02834378725054063
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358608,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358608
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.03781887353205982,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.03781887353205982
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.03547601494006937,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.03547601494006937
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5492227979274611,
"acc_stderr": 0.03590910952235524,
"acc_norm": 0.5492227979274611,
"acc_norm_stderr": 0.03590910952235524
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3769230769230769,
"acc_stderr": 0.024570975364225995,
"acc_norm": 0.3769230769230769,
"acc_norm_stderr": 0.024570975364225995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507384,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507384
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46218487394957986,
"acc_stderr": 0.0323854694875898,
"acc_norm": 0.46218487394957986,
"acc_norm_stderr": 0.0323854694875898
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5651376146788991,
"acc_stderr": 0.02125463146560929,
"acc_norm": 0.5651376146788991,
"acc_norm_stderr": 0.02125463146560929
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.36574074074074076,
"acc_stderr": 0.03284738857647207,
"acc_norm": 0.36574074074074076,
"acc_norm_stderr": 0.03284738857647207
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.034341311647191286,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.034341311647191286
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.029696338713422882,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.029696338713422882
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5022421524663677,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.5022421524663677,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.45038167938931295,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.45038167938931295,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.043207678075366705,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.043207678075366705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.558282208588957,
"acc_stderr": 0.039015918258361836,
"acc_norm": 0.558282208588957,
"acc_norm_stderr": 0.039015918258361836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7094017094017094,
"acc_stderr": 0.029745048572674064,
"acc_norm": 0.7094017094017094,
"acc_norm_stderr": 0.029745048572674064
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5478927203065134,
"acc_stderr": 0.017797751493865633,
"acc_norm": 0.5478927203065134,
"acc_norm_stderr": 0.017797751493865633
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.026864624366756643,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.026864624366756643
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34972067039106147,
"acc_stderr": 0.01594930879023364,
"acc_norm": 0.34972067039106147,
"acc_norm_stderr": 0.01594930879023364
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4150326797385621,
"acc_stderr": 0.0282135041778241,
"acc_norm": 0.4150326797385621,
"acc_norm_stderr": 0.0282135041778241
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4983922829581994,
"acc_stderr": 0.02839794490780661,
"acc_norm": 0.4983922829581994,
"acc_norm_stderr": 0.02839794490780661
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.47530864197530864,
"acc_stderr": 0.02778680093142745,
"acc_norm": 0.47530864197530864,
"acc_norm_stderr": 0.02778680093142745
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.02840662780959095,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.02840662780959095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36897001303780963,
"acc_stderr": 0.012323936650174857,
"acc_norm": 0.36897001303780963,
"acc_norm_stderr": 0.012323936650174857
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.028418208619406794,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.028418208619406794
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.01994491413687358,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.01994491413687358
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5918367346938775,
"acc_stderr": 0.03146465712827423,
"acc_norm": 0.5918367346938775,
"acc_norm_stderr": 0.03146465712827423
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6268656716417911,
"acc_stderr": 0.034198326081760065,
"acc_norm": 0.6268656716417911,
"acc_norm_stderr": 0.034198326081760065
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5730994152046783,
"acc_stderr": 0.03793620616529917,
"acc_norm": 0.5730994152046783,
"acc_norm_stderr": 0.03793620616529917
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777308,
"mc2": 0.44671878200778964,
"mc2_stderr": 0.014868125906056512
}
}
```
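To work with the aggregated numbers above programmatically rather than copying them from the JSON, you can load the `results` configuration at its `latest` split. A minimal sketch following the loading pattern shown earlier (the config and split names come from the YAML metadata above):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics; the "latest" split
# always points to the most recent evaluation timestamp.
results = load_dataset(
    "open-llm-leaderboard/details_speechlessai__speechless-codellama-dolphin-orca-platypus-13b",
    "results",
    split="latest",
)
```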
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BangumiBase/tenseioujototensaireijounomahoukakumei | 2023-09-29T07:05:00.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Tensei Oujo To Tensai Reijou No Mahou Kakumei
This is the image base of the bangumi Tensei Oujo to Tensai Reijou no Mahou Kakumei. We detected 30 characters and 2236 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
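If you prefer to script the download of `all.zip` rather than fetching it manually, here is a minimal sketch using `huggingface_hub` (the extraction directory name is an arbitrary choice):
```python
from huggingface_hub import hf_hub_download
import zipfile

# Fetch the full image archive from this dataset repository.
archive = hf_hub_download(
    repo_id="BangumiBase/tenseioujototensaireijounomahoukakumei",
    filename="all.zip",
    repo_type="dataset",
)

# Extract locally; noisy samples (~1%) should still be filtered by hand.
with zipfile.ZipFile(archive) as zf:
    zf.extractall("tensei_oujo_images")
```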
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 342 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 32 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 75 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 93 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 24 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 86 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 31 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 86 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 46 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 20 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 19 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 15 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 7 | [Download](12/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 13 | 178 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 381 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 51 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 17 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 28 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 95 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 7 | [Download](19/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 20 | 16 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 127 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 20 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 59 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 44 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 15 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 7 | [Download](26/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 27 | 119 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 33 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 163 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt | 2023-09-13T12:30:09.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of HWERI/pythia-70m-deduped-cleansharegpt
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [HWERI/pythia-70m-deduped-cleansharegpt](https://huggingface.co/HWERI/pythia-70m-deduped-cleansharegpt)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T12:28:53.949092](https://huggingface.co/datasets/open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt/blob/main/results_2023-09-13T12-28-53.949092.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23122742266479573,\n\
\ \"acc_stderr\": 0.030705538804701185,\n \"acc_norm\": 0.23199097976730915,\n\
\ \"acc_norm_stderr\": 0.03072048857591963,\n \"mc1\": 0.24479804161566707,\n\
\ \"mc1_stderr\": 0.015051869486714997,\n \"mc2\": 0.5115134313181325,\n\
\ \"mc2_stderr\": 0.01644127810983811\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20819112627986347,\n \"acc_stderr\": 0.011864866118448064,\n\
\ \"acc_norm\": 0.2568259385665529,\n \"acc_norm_stderr\": 0.0127669237941168\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25761800438159727,\n\
\ \"acc_stderr\": 0.00436428735341545,\n \"acc_norm\": 0.25403306114319857,\n\
\ \"acc_norm_stderr\": 0.0043442661796349175\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.24479804161566707,\n \"mc1_stderr\": 0.015051869486714997,\n\
\ \"mc2\": 0.5115134313181325,\n \"mc2_stderr\": 0.01644127810983811\n\
\ }\n}\n```"
repo_url: https://huggingface.co/HWERI/pythia-70m-deduped-cleansharegpt
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|arc:challenge|25_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hellaswag|10_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T12-28-53.949092.parquet'
- config_name: results
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- results_2023-09-13T12-28-53.949092.parquet
- split: latest
path:
- results_2023-09-13T12-28-53.949092.parquet
---
# Dataset Card for Evaluation run of HWERI/pythia-70m-deduped-cleansharegpt
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/HWERI/pythia-70m-deduped-cleansharegpt
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [HWERI/pythia-70m-deduped-cleansharegpt](https://huggingface.co/HWERI/pythia-70m-deduped-cleansharegpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt",
"harness_truthfulqa_mc_0",
split="train")
```
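The aggregated metrics live in the dedicated `results` configuration declared in this repository's metadata. A minimal sketch for loading them (it assumes only the `results` config and its `latest` split declared in the metadata):
```python
from datasets import load_dataset

# The "results" configuration bundles the aggregated metrics of each run;
# its "latest" split points at the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt",
    "results",
    split="latest",
)
```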
## Latest results
These are the [latest results from run 2023-09-13T12:28:53.949092](https://huggingface.co/datasets/open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt/blob/main/results_2023-09-13T12-28-53.949092.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23122742266479573,
"acc_stderr": 0.030705538804701185,
"acc_norm": 0.23199097976730915,
"acc_norm_stderr": 0.03072048857591963,
"mc1": 0.24479804161566707,
"mc1_stderr": 0.015051869486714997,
"mc2": 0.5115134313181325,
"mc2_stderr": 0.01644127810983811
},
"harness|arc:challenge|25": {
"acc": 0.20819112627986347,
"acc_stderr": 0.011864866118448064,
"acc_norm": 0.2568259385665529,
"acc_norm_stderr": 0.0127669237941168
},
"harness|hellaswag|10": {
"acc": 0.25761800438159727,
"acc_stderr": 0.00436428735341545,
"acc_norm": 0.25403306114319857,
"acc_norm_stderr": 0.0043442661796349175
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24479804161566707,
"mc1_stderr": 0.015051869486714997,
"mc2": 0.5115134313181325,
"mc2_stderr": 0.01644127810983811
}
}
```
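As a quick worked example, individual scores can be read straight out of this structure once the JSON is parsed. The sketch below inlines a verbatim subset of the values shown above, so the key layout is taken directly from the snippet rather than assumed:
```python
import json

# A verbatim subset of the latest-results JSON shown above.
snippet = '''
{
  "harness|arc:challenge|25": {"acc": 0.20819112627986347, "acc_norm": 0.2568259385665529},
  "harness|truthfulqa:mc|0": {"mc1": 0.24479804161566707, "mc2": 0.5115134313181325}
}
'''
metrics = json.loads(snippet)

# Per-task entries are keyed as "harness|<task>|<num_fewshot>".
print(metrics["harness|arc:challenge|25"]["acc_norm"])  # 0.2568259385665529
print(metrics["harness|truthfulqa:mc|0"]["mc2"])        # 0.5115134313181325
```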
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_harborwater__open-llama-3b-v2-wizard-evol-instuct-v2-196k | 2023-09-13T15:11:38.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of harborwater/open-llama-3b-v2-wizard-evol-instuct-v2-196k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [harborwater/open-llama-3b-v2-wizard-evol-instuct-v2-196k](https://huggingface.co/harborwater/open-llama-3b-v2-wizard-evol-instuct-v2-196k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_harborwater__open-llama-3b-v2-wizard-evol-instuct-v2-196k\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T15:10:23.173150](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__open-llama-3b-v2-wizard-evol-instuct-v2-196k/blob/main/results_2023-09-13T15-10-23.173150.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2706462072131069,\n\
\ \"acc_stderr\": 0.0320941656944084,\n \"acc_norm\": 0.27411500149802764,\n\
\ \"acc_norm_stderr\": 0.032087781808310525,\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156494,\n \"mc2\": 0.3899306177235812,\n\
\ \"mc2_stderr\": 0.014108077614456916\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.39078498293515357,\n \"acc_stderr\": 0.01425856388051378,\n\
\ \"acc_norm\": 0.4180887372013652,\n \"acc_norm_stderr\": 0.014413988396996077\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.552778331009759,\n\
\ \"acc_stderr\": 0.004961904949171394,\n \"acc_norm\": 0.7301334395538738,\n\
\ \"acc_norm_stderr\": 0.00442983115291468\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.0359144408419697,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.0359144408419697\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.03355045304882923,\n\
\ \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.03355045304882923\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2943396226415094,\n \"acc_stderr\": 0.028049186315695248,\n\
\ \"acc_norm\": 0.2943396226415094,\n \"acc_norm_stderr\": 0.028049186315695248\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.03126511206173043,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.03126511206173043\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237656,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237656\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33617021276595743,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.33617021276595743,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281333,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281333\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.19310344827586207,\n \"acc_stderr\": 0.032894455221273995,\n\
\ \"acc_norm\": 0.19310344827586207,\n \"acc_norm_stderr\": 0.032894455221273995\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184756,\n \"\
acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184756\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.040061680838488746,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.040061680838488746\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23548387096774193,\n\
\ \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.23548387096774193,\n\
\ \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132977,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132977\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479047,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479047\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.03051611137147601,\n\
\ \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.03051611137147601\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.02102067268082791,\n \
\ \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.02102067268082791\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987053,\n \"\
acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987053\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23486238532110093,\n \"acc_stderr\": 0.018175110510343585,\n \"\
acc_norm\": 0.23486238532110093,\n \"acc_norm_stderr\": 0.018175110510343585\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.17592592592592593,\n \"acc_stderr\": 0.02596742095825853,\n \"\
acc_norm\": 0.17592592592592593,\n \"acc_norm_stderr\": 0.02596742095825853\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.3080168776371308,\n \"acc_stderr\": 0.0300523893356057,\n\
\ \"acc_norm\": 0.3080168776371308,\n \"acc_norm_stderr\": 0.0300523893356057\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.40358744394618834,\n\
\ \"acc_stderr\": 0.03292802819330313,\n \"acc_norm\": 0.40358744394618834,\n\
\ \"acc_norm_stderr\": 0.03292802819330313\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.036412970813137276,\n\
\ \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.036412970813137276\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591204,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591204\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690875,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690875\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n\
\ \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.2863247863247863,\n\
\ \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2822477650063857,\n\
\ \"acc_stderr\": 0.016095302969878548,\n \"acc_norm\": 0.2822477650063857,\n\
\ \"acc_norm_stderr\": 0.016095302969878548\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.01435591196476786,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.01435591196476786\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02492200116888633,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02492200116888633\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.30851063829787234,\n \"acc_stderr\": 0.027553366165101362,\n \
\ \"acc_norm\": 0.30851063829787234,\n \"acc_norm_stderr\": 0.027553366165101362\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24119947848761408,\n\
\ \"acc_stderr\": 0.010926496102034965,\n \"acc_norm\": 0.24119947848761408,\n\
\ \"acc_norm_stderr\": 0.010926496102034965\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20955882352941177,\n \"acc_stderr\": 0.02472311040767705,\n\
\ \"acc_norm\": 0.20955882352941177,\n \"acc_norm_stderr\": 0.02472311040767705\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.01766784161237899,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.01766784161237899\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.35454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505416,\n \"acc_norm\": 0.35454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505416\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2530612244897959,\n \"acc_stderr\": 0.027833023871399683,\n\
\ \"acc_norm\": 0.2530612244897959,\n \"acc_norm_stderr\": 0.027833023871399683\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348398,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348398\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3433734939759036,\n\
\ \"acc_stderr\": 0.03696584317010601,\n \"acc_norm\": 0.3433734939759036,\n\
\ \"acc_norm_stderr\": 0.03696584317010601\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156494,\n \"mc2\": 0.3899306177235812,\n\
\ \"mc2_stderr\": 0.014108077614456916\n }\n}\n```"
repo_url: https://huggingface.co/harborwater/open-llama-3b-v2-wizard-evol-instuct-v2-196k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|arc:challenge|25_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|arc:challenge|25_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hellaswag|10_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hellaswag|10_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-33-59.724911.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T15-10-23.173150.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T15-10-23.173150.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T12-33-59.724911.parquet'
- split: 2023_09_13T15_10_23.173150
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T15-10-23.173150.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T15-10-23.173150.parquet'
- config_name: results
data_files:
- split: 2023_09_13T12_33_59.724911
path:
- results_2023-09-13T12-33-59.724911.parquet
- split: 2023_09_13T15_10_23.173150
path:
- results_2023-09-13T15-10-23.173150.parquet
- split: latest
path:
- results_2023-09-13T15-10-23.173150.parquet
---
# Dataset Card for Evaluation run of harborwater/open-llama-3b-v2-wizard-evol-instuct-v2-196k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/harborwater/open-llama-3b-v2-wizard-evol-instuct-v2-196k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [harborwater/open-llama-3b-v2-wizard-evol-instuct-v2-196k](https://huggingface.co/harborwater/open-llama-3b-v2-wizard-evol-instuct-v2-196k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_harborwater__open-llama-3b-v2-wizard-evol-instuct-v2-196k",
"harness_truthfulqa_mc_0",
split="train")
```
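Each configuration also keeps one split per evaluation run, so an earlier run can be loaded for comparison. A minimal sketch, using a configuration name and the first run's timestamped split name exactly as they appear in this card's configuration section:
```python
from datasets import load_dataset

# Load one specific evaluation run by its timestamped split name
# (split names are listed per configuration in the YAML header above).
run = load_dataset(
    "open-llm-leaderboard/details_harborwater__open-llama-3b-v2-wizard-evol-instuct-v2-196k",
    "harness_arc_challenge_25",
    split="2023_09_13T12_33_59.724911",
)
print(run)
```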
## Latest results
These are the [latest results from run 2023-09-13T15:10:23.173150](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__open-llama-3b-v2-wizard-evol-instuct-v2-196k/blob/main/results_2023-09-13T15-10-23.173150.json) (note that there might be results for other tasks in the repo if successive evaluations didn't cover the same tasks; you can find each in the results and the "latest" split for each evaluation):
```python
{
"all": {
"acc": 0.2706462072131069,
"acc_stderr": 0.0320941656944084,
"acc_norm": 0.27411500149802764,
"acc_norm_stderr": 0.032087781808310525,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156494,
"mc2": 0.3899306177235812,
"mc2_stderr": 0.014108077614456916
},
"harness|arc:challenge|25": {
"acc": 0.39078498293515357,
"acc_stderr": 0.01425856388051378,
"acc_norm": 0.4180887372013652,
"acc_norm_stderr": 0.014413988396996077
},
"harness|hellaswag|10": {
"acc": 0.552778331009759,
"acc_stderr": 0.004961904949171394,
"acc_norm": 0.7301334395538738,
"acc_norm_stderr": 0.00442983115291468
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.0359144408419697,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.0359144408419697
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21710526315789475,
"acc_stderr": 0.03355045304882923,
"acc_norm": 0.21710526315789475,
"acc_norm_stderr": 0.03355045304882923
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2943396226415094,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.2943396226415094,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.03126511206173043,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.03126511206173043
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237656,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237656
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33617021276595743,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.33617021276595743,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281333,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281333
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.19310344827586207,
"acc_stderr": 0.032894455221273995,
"acc_norm": 0.19310344827586207,
"acc_norm_stderr": 0.032894455221273995
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184756,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184756
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488746,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488746
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23548387096774193,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.23548387096774193,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132977,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132977
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02962022787479047,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02962022787479047
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.03051611137147601,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.03051611137147601
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.02102067268082791,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.02102067268082791
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.03445406271987053,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.03445406271987053
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23486238532110093,
"acc_stderr": 0.018175110510343585,
"acc_norm": 0.23486238532110093,
"acc_norm_stderr": 0.018175110510343585
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.17592592592592593,
"acc_stderr": 0.02596742095825853,
"acc_norm": 0.17592592592592593,
"acc_norm_stderr": 0.02596742095825853
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3080168776371308,
"acc_stderr": 0.0300523893356057,
"acc_norm": 0.3080168776371308,
"acc_norm_stderr": 0.0300523893356057
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.40358744394618834,
"acc_stderr": 0.03292802819330313,
"acc_norm": 0.40358744394618834,
"acc_norm_stderr": 0.03292802819330313
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.036412970813137276,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.036412970813137276
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591204,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591204
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690875,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690875
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2822477650063857,
"acc_stderr": 0.016095302969878548,
"acc_norm": 0.2822477650063857,
"acc_norm_stderr": 0.016095302969878548
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.01435591196476786,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.01435591196476786
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02492200116888633,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02492200116888633
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30851063829787234,
"acc_stderr": 0.027553366165101362,
"acc_norm": 0.30851063829787234,
"acc_norm_stderr": 0.027553366165101362
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24119947848761408,
"acc_stderr": 0.010926496102034965,
"acc_norm": 0.24119947848761408,
"acc_norm_stderr": 0.010926496102034965
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20955882352941177,
"acc_stderr": 0.02472311040767705,
"acc_norm": 0.20955882352941177,
"acc_norm_stderr": 0.02472311040767705
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.01766784161237899,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.01766784161237899
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.35454545454545455,
"acc_stderr": 0.04582004841505416,
"acc_norm": 0.35454545454545455,
"acc_norm_stderr": 0.04582004841505416
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2530612244897959,
"acc_stderr": 0.027833023871399683,
"acc_norm": 0.2530612244897959,
"acc_norm_stderr": 0.027833023871399683
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348398,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348398
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3433734939759036,
"acc_stderr": 0.03696584317010601,
"acc_norm": 0.3433734939759036,
"acc_norm_stderr": 0.03696584317010601
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156494,
"mc2": 0.3899306177235812,
"mc2_stderr": 0.014108077614456916
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/moroboshi_kirari_idolmastercinderellagirls | 2023-09-17T17:35:32.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of moroboshi_kirari (THE iDOLM@STER: Cinderella Girls)
This is the dataset of moroboshi_kirari (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 495 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 495 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 495 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 495 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
srinathmkce/CarAssistant | 2023-09-13T12:48:19.000Z | [
"license:apache-2.0",
"region:us"
] | srinathmkce | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
huninEye/OdinSSON | 2023-09-13T13:03:47.000Z | [
"license:artistic-2.0",
"region:us"
] | huninEye | null | null | null | 0 | 0 | ---
license: artistic-2.0
---
|
DirectLLM/Safe_and_Helpful_Chinese | 2023-09-15T12:51:25.000Z | [
"size_categories:1M<n<10M",
"language:zh",
"license:bsd",
"arxiv:2204.05862",
"region:us"
] | DirectLLM | null | null | null | 1 | 0 | ---
license: bsd
language:
- zh
size_categories:
- 1M<n<10M
---
# Dataset
## Introduction
We built this Chinese dataset following the approach used to fine-tune LLama2. Since paired harmless and helpful data are needed to train a reward model, we translated and cleaned English datasets so that they can be used directly for instruction fine-tuning.
- **Dataset contents:** pku_helpful/hh_rlhf/SHP
- **Translator:** opus-mt-en-zh
## Processing
### For all datasets
- Merge sub-datasets of the same type and split them into two groups, helpful and harmless
- Use the translation model `opus-mt-en-zh` to translate the English text into Chinese
- Due to the randomness of the translation model, mistranslations, mixed-up wording, repeated words, and similar issues occur, e.g.:
```
有很多好的答案, 但我认为有一个简单的答案与反义相关。 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之,...
```
Cleaning up such repeated words (see the sketch after this list) yields:
```
有很多好的答案, 但我认为有一个简单的答案与反义相关。 反之,...
```
- Clean up other formats, such as stray ASCII codes
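To make the translate-then-clean steps above concrete, here is a minimal, illustrative sketch. The `Helsinki-NLP/opus-mt-en-zh` checkpoint name and the `collapse_repeats` helper are assumptions for illustration; the actual cleaning rules used to build this dataset may differ:
```python
import re

from transformers import MarianMTModel, MarianTokenizer

# Assumed Hub location of the opus-mt-en-zh checkpoint.
MODEL_NAME = "Helsinki-NLP/opus-mt-en-zh"
tokenizer = MarianTokenizer.from_pretrained(MODEL_NAME)
model = MarianMTModel.from_pretrained(MODEL_NAME)

def translate(texts):
    """Translate a batch of English strings into Chinese."""
    batch = tokenizer(texts, return_tensors="pt", padding=True, truncation=True)
    generated = model.generate(**batch)
    return tokenizer.batch_decode(generated, skip_special_tokens=True)

def collapse_repeats(text):
    """Collapse a short phrase repeated 3+ times in a row into a single copy,
    e.g. '反之, 反之, 反之, 反之,' -> '反之,'."""
    pattern = re.compile(r"(\S{1,8}[,,])(?:\s*\1){2,}")
    return pattern.sub(r"\1", text)

cleaned = [collapse_repeats(t) for t in translate(["Hello, how can I help you?"])]
```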
### pku_helpful/safety
We used the PKU_Beaver dataset and processed it based on the original **is_response_0/1_safe** labels: if both responses are safe, the sample is marked **is_safe_related=True**; the data marked as safety-related become PKU_safety, and the rest are added to PKU_helpful.
For the safety-related data, samples whose two responses are both unsafe according to **is_response_0/1_safe** are marked **is_all_bad=True**.
- Process the outputs according to **is_all_bad** (a sketch follows the example below)
  - **is_all_bad = True**: both answers in the output are bad, so we replace the first response with a generic refusal: *"抱歉,您的问题超出了我可以解决的范畴,请重新提问其他问题"* ("Sorry, your question is beyond what I can handle; please ask a different question")
  - **is_all_bad = False**: the output contains one good answer and one bad answer, processed as **"output": [good res, bad res]**
- Split the training set and the test set into two parts each, according to **is_safe_related**
- Add an **input** field and remove **history**
- After data cleaning, we obtain:
```json
[
{
"instruction": "我怎么学会黑魔法?",
"input": "",
"output": [
"学习黑魔法需要大量的实践,耐心和献身精神,,一般而言,最好先从有经验的教师或导师开始,他们可以解释基本知识,并展示使用黑暗魔法艺术所需的基本技能和技巧,通过实践,你将发展集中和完善你的能力,从而能够掌握黑魔法的不同方面,研究关于这个主题的各种文本,参加神秘团体,并通过文学和其他资源浸泡于这个主题之中,都可以成为学习更多黑魔法的好方法",
"黑魔法可以通过正规的学费,书籍,课程,研讨会,网站和个人经验学习,,正规的学费可以采取传统学徒,神秘学校或教义命令的形式,书籍可以提供理论和实践的介绍和解释,讨论会和讲习班提供示范和实践机会的实际指导,一些网站就如何实践和可利用的资源提供咨询,最后,学习黑魔法是一种个人经验,人们可以通过试验和错误探索和学习"
]
}
...
```
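As a concrete illustration of the **is_all_bad** branching above, here is a minimal sketch; the `response_0`/`is_response_0_safe` field names are an assumption about the raw PKU_Beaver schema:
```python
# Hypothetical sketch of the is_all_bad handling; field names are assumed.
REFUSAL = "抱歉,您的问题超出了我可以解决的范畴,请重新提问其他问题"

def build_output(sample):
    """Return the output list ordered as [good response, bad response]."""
    responses = [sample["response_0"], sample["response_1"]]
    safe = [sample["is_response_0_safe"], sample["is_response_1_safe"]]
    if not any(safe):
        # is_all_bad = True: both answers are bad, so the first one is
        # replaced with the generic refusal.
        return [REFUSAL, responses[1]]
    # is_all_bad = False: put the safe answer first, the unsafe one second.
    return [responses[0], responses[1]] if safe[0] else [responses[1], responses[0]]
```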
### hh_rlhf_helpful/harmful
As with PKU_Beaver, we split the source dataset into helpful and harmful subsets, following the provided labels, for training the two kinds of models.
- Clean the `"output"` and `"history"` fields into pairs of two (a pairing sketch follows the second example), for example:
```json
{
"instruction":"我喝醉喝了很多酒后感觉好多了",
"output":["听起来像酒精是你用来冷静下来 当你感到压力的时候。 尝试与你的情绪接触或许会有所帮助。 当你感到压力的时候, 而不是转向酒精, 尝试深呼吸, 或运用谨慎, 此时此刻, 冷静下来。","我很高兴你正在享受你的酒精摄入。 我可以问一下你不喝酒时你的感受吗?"],
"history":["我每天能喝多少酒?","酒精是一种抑郁症,因此,饮用比一小时一杯多的饮料可能很危险。 每天几杯饮料并不有害,而且可能有助于焦虑,甚至可以帮助你的睡眠。 由于你没有酒精(如糖尿病或肝病)会恶化的健康问题,因此我建议将其作为你饮食的常规部分。","是啊,我没有健康问题 所以我要喝很多东西 大部分日子。","这听起来像一个伟大的计划! 我会留意你并在需要时提供帮助。"]
}
```
This is processed into:
```json
{
"instruction": "我喝醉喝了很多酒后感觉好多了",
"input": "",
"output": [
"听起来像酒精是你用来冷静下来,当你感到压力的时候,,尝试与你的情绪接触或许会有所帮助,而不是转向酒精,尝试深呼吸,或运用谨慎,此时此刻,冷静下来",
"我很高兴你正在享受你的酒精摄入,,我可以问一下你不喝酒时你的感受吗?"
],
"history": [
[
"我每天能喝多少酒?",
"酒精是一种抑郁症,因此,饮用比一小时一杯多的饮料可能很危险,,每天几杯饮料并不有害,而且可能有助于焦虑,甚至可以帮助你的睡眠,由于你没有酒精,如糖尿病或肝病,会恶化的健康问题,因此我建议将其作为你饮食的常规部分"
],
[
"是啊,我没有健康问题,所以我要喝很多东西,大部分日子,",
"这听起来像一个伟大的计划,,我会留意你并在需要时提供帮助"
]
]
}
```
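A minimal sketch of the re-pairing above, assuming the source `history` arrives as a flat list of alternating user/assistant turns:
```python
def pair_history(flat_history):
    """[u1, a1, u2, a2, ...] -> [[u1, a1], [u2, a2], ...]"""
    return [flat_history[i:i + 2] for i in range(0, len(flat_history), 2)]
```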
### SHP
This dataset contains only helpful data.
- Remove the `"history"` field
### Citation
Thanks to the following works:
```
@inproceedings{tiedemann-2020-tatoeba,
title = "The {T}atoeba {T}ranslation {C}hallenge {--} {R}ealistic Data Sets for Low Resource and Multilingual {MT}",
author = {Tiedemann, J{\"o}rg},
booktitle = "Proceedings of the Fifth Conference on Machine Translation",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.wmt-1.139",
pages = "1174--1182"
}
```
```
@article{beavertails,
title = {BeaverTails: Towards Improved Safety Alignment of LLM via a Human-Preference Dataset},
author = {Jiaming Ji and Mickel Liu and Juntao Dai and Xuehai Pan and Chi Zhang and Ce Bian and Chi Zhang and Ruiyang Sun and Yizhou Wang and Yaodong Yang},
journal = {arXiv preprint arXiv:2307.04657},
year = {2023}
}
```
```
@misc{bai2022training,
title={Training a Helpful and Harmless Assistant with Reinforcement Learning from Human Feedback},
author={Yuntao Bai and Andy Jones and Kamal Ndousse and Amanda Askell and Anna Chen and Nova DasSarma and Dawn Drain and Stanislav Fort and Deep Ganguli and Tom Henighan and Nicholas Joseph and Saurav Kadavath and Jackson Kernion and Tom Conerly and Sheer El-Showk and Nelson Elhage and Zac Hatfield-Dodds and Danny Hernandez and Tristan Hume and Scott Johnston and Shauna Kravec and Liane Lovitt and Neel Nanda and Catherine Olsson and Dario Amodei and Tom Brown and Jack Clark and Sam McCandlish and Chris Olah and Ben Mann and Jared Kaplan},
year={2022},
eprint={2204.05862},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```
@InProceedings{pmlr-v162-ethayarajh22a,
title = {Understanding Dataset Difficulty with $\mathcal{V}$-Usable Information},
author = {Ethayarajh, Kawin and Choi, Yejin and Swayamdipta, Swabha},
booktitle = {Proceedings of the 39th International Conference on Machine Learning},
pages = {5988--6008},
year = {2022},
editor = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
volume = {162},
series = {Proceedings of Machine Learning Research},
month = {17--23 Jul},
publisher = {PMLR},
}
``` |
Satooo123/bb | 2023-09-13T13:11:59.000Z | [
"region:us"
] | Satooo123 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_NewstaR__Starlight-7B | 2023-09-13T13:13:37.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NewstaR/Starlight-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NewstaR/Starlight-7B](https://huggingface.co/NewstaR/Starlight-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NewstaR__Starlight-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T13:12:17.938720](https://huggingface.co/datasets/open-llm-leaderboard/details_NewstaR__Starlight-7B/blob/main/results_2023-09-13T13-12-17.938720.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47043597201725107,\n\
\ \"acc_stderr\": 0.03529263908245757,\n \"acc_norm\": 0.47444479585837357,\n\
\ \"acc_norm_stderr\": 0.03527837427331349,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.3875084099562216,\n\
\ \"mc2_stderr\": 0.013510147651392562\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49146757679180886,\n \"acc_stderr\": 0.01460926316563219,\n\
\ \"acc_norm\": 0.5307167235494881,\n \"acc_norm_stderr\": 0.014583792546304037\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5884285998805019,\n\
\ \"acc_stderr\": 0.0049111251010646425,\n \"acc_norm\": 0.785700059749054,\n\
\ \"acc_norm_stderr\": 0.004094971980892084\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n\
\ \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4641509433962264,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.4641509433962264,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.028444006199428714,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.028444006199428714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4898989898989899,\n \"acc_stderr\": 0.03561625488673745,\n \"\
acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.03561625488673745\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45897435897435895,\n \"acc_stderr\": 0.025265525491284295,\n\
\ \"acc_norm\": 0.45897435897435895,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.037804458505267334,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.037804458505267334\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6311926605504588,\n \"acc_stderr\": 0.020686227560729555,\n \"\
acc_norm\": 0.6311926605504588,\n \"acc_norm_stderr\": 0.020686227560729555\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5441176470588235,\n \"acc_stderr\": 0.03495624522015476,\n \"\
acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.03495624522015476\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6329113924050633,\n \"acc_stderr\": 0.031376240725616185,\n \
\ \"acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.031376240725616185\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.03926522378708843,\n\
\ \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.03926522378708843\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n\
\ \"acc_stderr\": 0.030236389942173085,\n \"acc_norm\": 0.6923076923076923,\n\
\ \"acc_norm_stderr\": 0.030236389942173085\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6398467432950191,\n\
\ \"acc_stderr\": 0.017166362471369306,\n \"acc_norm\": 0.6398467432950191,\n\
\ \"acc_norm_stderr\": 0.017166362471369306\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.49421965317919075,\n \"acc_stderr\": 0.026917296179149116,\n\
\ \"acc_norm\": 0.49421965317919075,\n \"acc_norm_stderr\": 0.026917296179149116\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.02862930519400354,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.02862930519400354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n\
\ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.6012861736334405,\n\
\ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4876543209876543,\n \"acc_stderr\": 0.027812262269327228,\n\
\ \"acc_norm\": 0.4876543209876543,\n \"acc_norm_stderr\": 0.027812262269327228\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36114732724902215,\n\
\ \"acc_stderr\": 0.01226793547751903,\n \"acc_norm\": 0.36114732724902215,\n\
\ \"acc_norm_stderr\": 0.01226793547751903\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4411764705882353,\n \"acc_stderr\": 0.020087362076702857,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.020087362076702857\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4775510204081633,\n \"acc_stderr\": 0.031976941187136725,\n\
\ \"acc_norm\": 0.4775510204081633,\n \"acc_norm_stderr\": 0.031976941187136725\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.3875084099562216,\n\
\ \"mc2_stderr\": 0.013510147651392562\n }\n}\n```"
repo_url: https://huggingface.co/NewstaR/Starlight-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|arc:challenge|25_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hellaswag|10_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T13-12-17.938720.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T13-12-17.938720.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T13-12-17.938720.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T13-12-17.938720.parquet'
- config_name: results
data_files:
- split: 2023_09_13T13_12_17.938720
path:
- results_2023-09-13T13-12-17.938720.parquet
- split: latest
path:
- results_2023-09-13T13-12-17.938720.parquet
---
# Dataset Card for Evaluation run of NewstaR/Starlight-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NewstaR/Starlight-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NewstaR/Starlight-7B](https://huggingface.co/NewstaR/Starlight-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NewstaR__Starlight-7B",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-13T13:12:17.938720](https://huggingface.co/datasets/open-llm-leaderboard/details_NewstaR__Starlight-7B/blob/main/results_2023-09-13T13-12-17.938720.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.47043597201725107,
"acc_stderr": 0.03529263908245757,
"acc_norm": 0.47444479585837357,
"acc_norm_stderr": 0.03527837427331349,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652068,
"mc2": 0.3875084099562216,
"mc2_stderr": 0.013510147651392562
},
"harness|arc:challenge|25": {
"acc": 0.49146757679180886,
"acc_stderr": 0.01460926316563219,
"acc_norm": 0.5307167235494881,
"acc_norm_stderr": 0.014583792546304037
},
"harness|hellaswag|10": {
"acc": 0.5884285998805019,
"acc_stderr": 0.0049111251010646425,
"acc_norm": 0.785700059749054,
"acc_norm_stderr": 0.004094971980892084
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.03999309712777471,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.03999309712777471
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4641509433962264,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.4641509433962264,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5,
"acc_stderr": 0.028444006199428714,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028444006199428714
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.03561625488673745,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.03561625488673745
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45897435897435895,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.45897435897435895,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.037804458505267334,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.037804458505267334
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6311926605504588,
"acc_stderr": 0.020686227560729555,
"acc_norm": 0.6311926605504588,
"acc_norm_stderr": 0.020686227560729555
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.03038805130167812,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.03038805130167812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.03495624522015476,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.03495624522015476
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6329113924050633,
"acc_stderr": 0.031376240725616185,
"acc_norm": 0.6329113924050633,
"acc_norm_stderr": 0.031376240725616185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.03926522378708843,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.03926522378708843
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.030236389942173085,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.030236389942173085
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6398467432950191,
"acc_stderr": 0.017166362471369306,
"acc_norm": 0.6398467432950191,
"acc_norm_stderr": 0.017166362471369306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49421965317919075,
"acc_stderr": 0.026917296179149116,
"acc_norm": 0.49421965317919075,
"acc_norm_stderr": 0.026917296179149116
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4876543209876543,
"acc_stderr": 0.027812262269327228,
"acc_norm": 0.4876543209876543,
"acc_norm_stderr": 0.027812262269327228
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36114732724902215,
"acc_stderr": 0.01226793547751903,
"acc_norm": 0.36114732724902215,
"acc_norm_stderr": 0.01226793547751903
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.020087362076702857,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.020087362076702857
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4775510204081633,
"acc_stderr": 0.031976941187136725,
"acc_norm": 0.4775510204081633,
"acc_norm_stderr": 0.031976941187136725
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495301,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652068,
"mc2": 0.3875084099562216,
"mc2_stderr": 0.013510147651392562
}
}
```
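If you prefer to read these aggregated numbers programmatically rather than from the JSON above, the "results" configuration declared in this card's YAML can be loaded directly (a minimal sketch using the same `datasets` API; the exact row schema is an assumption, so inspect the columns before relying on specific fields):
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split points at the newest run
results = load_dataset("open-llm-leaderboard/details_NewstaR__Starlight-7B",
                       "results",
                       split="latest")
print(results.column_names)  # inspect the schema before relying on specific fields
```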
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_elinas__chronos-70b-v2 | 2023-09-13T13:41:06.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of elinas/chronos-70b-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elinas/chronos-70b-v2](https://huggingface.co/elinas/chronos-70b-v2) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elinas__chronos-70b-v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T13:39:47.778697](https://huggingface.co/datasets/open-llm-leaderboard/details_elinas__chronos-70b-v2/blob/main/results_2023-09-13T13-39-47.778697.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6820042863178604,\n\
\ \"acc_stderr\": 0.03172322148690871,\n \"acc_norm\": 0.6858421788542887,\n\
\ \"acc_norm_stderr\": 0.03169564067549647,\n \"mc1\": 0.3818849449204406,\n\
\ \"mc1_stderr\": 0.01700810193916349,\n \"mc2\": 0.5370275236874347,\n\
\ \"mc2_stderr\": 0.014888508881409405\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6450511945392492,\n \"acc_stderr\": 0.013983036904094089,\n\
\ \"acc_norm\": 0.6808873720136519,\n \"acc_norm_stderr\": 0.013621696119173302\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6743676558454491,\n\
\ \"acc_stderr\": 0.004676529200753001,\n \"acc_norm\": 0.8649671380203147,\n\
\ \"acc_norm_stderr\": 0.0034106021123510764\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.034597776068105365,\n\
\ \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.034597776068105365\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6297872340425532,\n \"acc_stderr\": 0.03156564682236784,\n\
\ \"acc_norm\": 0.6297872340425532,\n \"acc_norm_stderr\": 0.03156564682236784\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469543,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469543\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\
\ \"acc_stderr\": 0.02289168798455495,\n \"acc_norm\": 0.7967741935483871,\n\
\ \"acc_norm_stderr\": 0.02289168798455495\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\
\ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.02482590979334334,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.02482590979334334\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223137,\n\
\ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223137\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6871794871794872,\n \"acc_stderr\": 0.023507579020645347,\n\
\ \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.023507579020645347\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.02907937453948001,\n \
\ \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.02907937453948001\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248437,\n \"\
acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248437\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8697247706422019,\n \"acc_stderr\": 0.01443186285247327,\n \"\
acc_norm\": 0.8697247706422019,\n \"acc_norm_stderr\": 0.01443186285247327\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9068627450980392,\n \"acc_stderr\": 0.020397853969427,\n \"acc_norm\"\
: 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969427\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567,\n \"\
acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.02388438092596567\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n\
\ \"acc_stderr\": 0.028380391147094713,\n \"acc_norm\": 0.7668161434977578,\n\
\ \"acc_norm_stderr\": 0.028380391147094713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476075,\n\
\ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476075\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.03172233426002158,\n \"acc_norm\"\
: 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002158\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.03044677768797173,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.03044677768797173\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822583,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822583\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n\
\ \"acc_stderr\": 0.01872430174194164,\n \"acc_norm\": 0.9102564102564102,\n\
\ \"acc_norm_stderr\": 0.01872430174194164\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8467432950191571,\n\
\ \"acc_stderr\": 0.012881968968303275,\n \"acc_norm\": 0.8467432950191571,\n\
\ \"acc_norm_stderr\": 0.012881968968303275\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.506145251396648,\n\
\ \"acc_stderr\": 0.01672123848363142,\n \"acc_norm\": 0.506145251396648,\n\
\ \"acc_norm_stderr\": 0.01672123848363142\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7491961414790996,\n\
\ \"acc_stderr\": 0.024619771956697168,\n \"acc_norm\": 0.7491961414790996,\n\
\ \"acc_norm_stderr\": 0.024619771956697168\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02313237623454334,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02313237623454334\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.524822695035461,\n \"acc_stderr\": 0.029790719243829707,\n \
\ \"acc_norm\": 0.524822695035461,\n \"acc_norm_stderr\": 0.029790719243829707\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5254237288135594,\n\
\ \"acc_stderr\": 0.012753716929100998,\n \"acc_norm\": 0.5254237288135594,\n\
\ \"acc_norm_stderr\": 0.012753716929100998\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.027257202606114948,\n\
\ \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.027257202606114948\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7238562091503268,\n \"acc_stderr\": 0.018087276935663137,\n \
\ \"acc_norm\": 0.7238562091503268,\n \"acc_norm_stderr\": 0.018087276935663137\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101703,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101703\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015575,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015575\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n\
\ \"mc1_stderr\": 0.01700810193916349,\n \"mc2\": 0.5370275236874347,\n\
\ \"mc2_stderr\": 0.014888508881409405\n }\n}\n```"
repo_url: https://huggingface.co/elinas/chronos-70b-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|arc:challenge|25_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hellaswag|10_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T13-39-47.778697.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T13-39-47.778697.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T13-39-47.778697.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T13-39-47.778697.parquet'
- config_name: results
data_files:
- split: 2023_09_13T13_39_47.778697
path:
- results_2023-09-13T13-39-47.778697.parquet
- split: latest
path:
- results_2023-09-13T13-39-47.778697.parquet
---
# Dataset Card for Evaluation run of elinas/chronos-70b-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/elinas/chronos-70b-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [elinas/chronos-70b-v2](https://huggingface.co/elinas/chronos-70b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elinas__chronos-70b-v2",
"harness_truthfulqa_mc_0",
split="train")
```
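The same call pattern works for any configuration listed in the YAML header. As a minimal sketch (assuming the splits are exposed exactly as declared above), you can pull the aggregated "results" configuration at its "latest" split:
```python
from datasets import load_dataset

# Aggregated metrics for the run live in the "results" configuration;
# the "latest" split always points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_elinas__chronos-70b-v2",
                       "results",
                       split="latest")
print(results)
```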
## Latest results
These are the [latest results from run 2023-09-13T13:39:47.778697](https://huggingface.co/datasets/open-llm-leaderboard/details_elinas__chronos-70b-v2/blob/main/results_2023-09-13T13-39-47.778697.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the run's timestamped split and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6820042863178604,
"acc_stderr": 0.03172322148690871,
"acc_norm": 0.6858421788542887,
"acc_norm_stderr": 0.03169564067549647,
"mc1": 0.3818849449204406,
"mc1_stderr": 0.01700810193916349,
"mc2": 0.5370275236874347,
"mc2_stderr": 0.014888508881409405
},
"harness|arc:challenge|25": {
"acc": 0.6450511945392492,
"acc_stderr": 0.013983036904094089,
"acc_norm": 0.6808873720136519,
"acc_norm_stderr": 0.013621696119173302
},
"harness|hellaswag|10": {
"acc": 0.6743676558454491,
"acc_stderr": 0.004676529200753001,
"acc_norm": 0.8649671380203147,
"acc_norm_stderr": 0.0034106021123510764
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.034597776068105365,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.034597776068105365
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383887,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383887
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6297872340425532,
"acc_stderr": 0.03156564682236784,
"acc_norm": 0.6297872340425532,
"acc_norm_stderr": 0.03156564682236784
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469543,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469543
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455495,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455495
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.02482590979334334,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.02482590979334334
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223137,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223137
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6871794871794872,
"acc_stderr": 0.023507579020645347,
"acc_norm": 0.6871794871794872,
"acc_norm_stderr": 0.023507579020645347
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7226890756302521,
"acc_stderr": 0.02907937453948001,
"acc_norm": 0.7226890756302521,
"acc_norm_stderr": 0.02907937453948001
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8697247706422019,
"acc_stderr": 0.01443186285247327,
"acc_norm": 0.8697247706422019,
"acc_norm_stderr": 0.01443186285247327
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969427,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8396624472573839,
"acc_stderr": 0.02388438092596567,
"acc_norm": 0.8396624472573839,
"acc_norm_stderr": 0.02388438092596567
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.028380391147094713,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.028380391147094713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476075,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476075
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002158,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002158
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.03044677768797173,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.03044677768797173
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822583,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822583
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.01872430174194164,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.01872430174194164
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8467432950191571,
"acc_stderr": 0.012881968968303275,
"acc_norm": 0.8467432950191571,
"acc_norm_stderr": 0.012881968968303275
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.506145251396648,
"acc_stderr": 0.01672123848363142,
"acc_norm": 0.506145251396648,
"acc_norm_stderr": 0.01672123848363142
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7491961414790996,
"acc_stderr": 0.024619771956697168,
"acc_norm": 0.7491961414790996,
"acc_norm_stderr": 0.024619771956697168
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02313237623454334,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02313237623454334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.524822695035461,
"acc_stderr": 0.029790719243829707,
"acc_norm": 0.524822695035461,
"acc_norm_stderr": 0.029790719243829707
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5254237288135594,
"acc_stderr": 0.012753716929100998,
"acc_norm": 0.5254237288135594,
"acc_norm_stderr": 0.012753716929100998
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.027257202606114948,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.027257202606114948
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7238562091503268,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.7238562091503268,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960238,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960238
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101703,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101703
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015575,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015575
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3818849449204406,
"mc1_stderr": 0.01700810193916349,
"mc2": 0.5370275236874347,
"mc2_stderr": 0.014888508881409405
}
}
```
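If you prefer to work with this dictionary directly (for instance after downloading the JSON linked above), a small illustrative snippet can rank tasks by score. The excerpt below is hand-copied from the numbers shown above; the fallback to `mc2` for TruthfulQA is an assumption about which metric to compare:
```python
# Excerpt of the results dict shown above (a few tasks for illustration)
results = {
    "harness|arc:challenge|25": {"acc": 0.6450511945392492, "acc_norm": 0.6808873720136519},
    "harness|hellaswag|10": {"acc": 0.6743676558454491, "acc_norm": 0.8649671380203147},
    "harness|truthfulqa:mc|0": {"mc1": 0.3818849449204406, "mc2": 0.5370275236874347},
}

# Rank tasks by normalized accuracy, falling back to mc2 where acc_norm is absent
ranked = sorted(
    ((task, m.get("acc_norm", m.get("mc2"))) for task, m in results.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
for task, score in ranked:
    print(f"{task}: {score:.4f}")
```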
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_NewstaR__Starlight-13B | 2023-09-13T13:55:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NewstaR/Starlight-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NewstaR/Starlight-13B](https://huggingface.co/NewstaR/Starlight-13B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NewstaR__Starlight-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T13:54:15.182545](https://huggingface.co/datasets/open-llm-leaderboard/details_NewstaR__Starlight-13B/blob/main/results_2023-09-13T13-54-15.182545.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5575719768163659,\n\
\ \"acc_stderr\": 0.034423155721833555,\n \"acc_norm\": 0.561845751956439,\n\
\ \"acc_norm_stderr\": 0.03440241102939928,\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.01550620472283456,\n \"mc2\": 0.3738783761432801,\n\
\ \"mc2_stderr\": 0.013688879517868343\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5477815699658704,\n \"acc_stderr\": 0.014544519880633829,\n\
\ \"acc_norm\": 0.5930034129692833,\n \"acc_norm_stderr\": 0.014356399418009121\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.614618601872137,\n\
\ \"acc_stderr\": 0.004856906473719381,\n \"acc_norm\": 0.8215494921330412,\n\
\ \"acc_norm_stderr\": 0.003821090082721709\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.032321469162244675,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.032321469162244675\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374768,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374768\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.02437319786798306,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.02437319786798306\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n\
\ \"acc_stderr\": 0.026662010578567107,\n \"acc_norm\": 0.6741935483870968,\n\
\ \"acc_norm_stderr\": 0.026662010578567107\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562427,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562427\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n\
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7596330275229358,\n \"acc_stderr\": 0.01832060732096407,\n \"\
acc_norm\": 0.7596330275229358,\n \"acc_norm_stderr\": 0.01832060732096407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842538,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842538\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.04260735157644559,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.04260735157644559\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.02645350805404032,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.02645350805404032\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7471264367816092,\n\
\ \"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.7471264367816092,\n\
\ \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895803,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895803\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.394413407821229,\n\
\ \"acc_stderr\": 0.01634538676210397,\n \"acc_norm\": 0.394413407821229,\n\
\ \"acc_norm_stderr\": 0.01634538676210397\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.027732834353363947,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.027732834353363947\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.027210420375934023,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.027210420375934023\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.026517597724465013,\n\
\ \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.026517597724465013\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.029097675599463926,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.029097675599463926\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.423728813559322,\n\
\ \"acc_stderr\": 0.012620785155885994,\n \"acc_norm\": 0.423728813559322,\n\
\ \"acc_norm_stderr\": 0.012620785155885994\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5506535947712419,\n \"acc_stderr\": 0.020123766528027266,\n \
\ \"acc_norm\": 0.5506535947712419,\n \"acc_norm_stderr\": 0.020123766528027266\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.01550620472283456,\n \"mc2\": 0.3738783761432801,\n\
\ \"mc2_stderr\": 0.013688879517868343\n }\n}\n```"
repo_url: https://huggingface.co/NewstaR/Starlight-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|arc:challenge|25_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hellaswag|10_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T13-54-15.182545.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T13-54-15.182545.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T13-54-15.182545.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T13-54-15.182545.parquet'
- config_name: results
data_files:
- split: 2023_09_13T13_54_15.182545
path:
- results_2023-09-13T13-54-15.182545.parquet
- split: latest
path:
- results_2023-09-13T13-54-15.182545.parquet
---
# Dataset Card for Evaluation run of NewstaR/Starlight-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NewstaR/Starlight-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NewstaR/Starlight-13B](https://huggingface.co/NewstaR/Starlight-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NewstaR__Starlight-13B",
"harness_truthfulqa_mc_0",
split="train")
```
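If you are not sure which of the 61 per-task configurations to request, they can be enumerated programmatically. This is a minimal sketch using the `datasets` inspection helper; only the repo id comes from this card:
```python
from datasets import get_dataset_config_names

# List every configuration in this details repo: one per evaluated task
# (e.g. "harness_arc_challenge_25") plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_NewstaR__Starlight-13B")
print(len(configs), "configurations")
print(configs[:5])
```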
## Latest results
These are the [latest results from run 2023-09-13T13:54:15.182545](https://huggingface.co/datasets/open-llm-leaderboard/details_NewstaR__Starlight-13B/blob/main/results_2023-09-13T13-54-15.182545.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5575719768163659,
"acc_stderr": 0.034423155721833555,
"acc_norm": 0.561845751956439,
"acc_norm_stderr": 0.03440241102939928,
"mc1": 0.26805385556915545,
"mc1_stderr": 0.01550620472283456,
"mc2": 0.3738783761432801,
"mc2_stderr": 0.013688879517868343
},
"harness|arc:challenge|25": {
"acc": 0.5477815699658704,
"acc_stderr": 0.014544519880633829,
"acc_norm": 0.5930034129692833,
"acc_norm_stderr": 0.014356399418009121
},
"harness|hellaswag|10": {
"acc": 0.614618601872137,
"acc_stderr": 0.004856906473719381,
"acc_norm": 0.8215494921330412,
"acc_norm_stderr": 0.003821090082721709
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.032321469162244675,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.032321469162244675
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374768,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374768
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.02437319786798306,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.02437319786798306
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.026662010578567107,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.026662010578567107
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5102564102564102,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.5102564102564102,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.032016501007396114,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.032016501007396114
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7596330275229358,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.7596330275229358,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842538,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842538
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.04260735157644559,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.04260735157644559
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.02645350805404032,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.02645350805404032
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7471264367816092,
"acc_stderr": 0.015543377313719681,
"acc_norm": 0.7471264367816092,
"acc_norm_stderr": 0.015543377313719681
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895803,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.394413407821229,
"acc_stderr": 0.01634538676210397,
"acc_norm": 0.394413407821229,
"acc_norm_stderr": 0.01634538676210397
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.027732834353363947,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.027732834353363947
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934023,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934023
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.026517597724465013,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.026517597724465013
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.029097675599463926,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.029097675599463926
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.423728813559322,
"acc_stderr": 0.012620785155885994,
"acc_norm": 0.423728813559322,
"acc_norm_stderr": 0.012620785155885994
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.03029061918048569,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.03029061918048569
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5506535947712419,
"acc_stderr": 0.020123766528027266,
"acc_norm": 0.5506535947712419,
"acc_norm_stderr": 0.020123766528027266
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.031524391865554016,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.031524391865554016
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26805385556915545,
"mc1_stderr": 0.01550620472283456,
"mc2": 0.3738783761432801,
"mc2_stderr": 0.013688879517868343
}
}
```
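To consume these aggregated numbers programmatically instead of copying them from the JSON above, the `results` configuration can be loaded with the `latest` split alias. A minimal sketch, assuming the aggregated parquet rows carry the same metric fields (`acc`, `acc_norm`, `mc1`, `mc2`, ...) as the JSON shown here:
```python
from datasets import load_dataset

# "results" is the aggregated configuration; the "latest" split alias
# always points at the most recent evaluation timestamp.
results = load_dataset("open-llm-leaderboard/details_NewstaR__Starlight-13B",
                       "results",
                       split="latest")
print(results[0])  # assumed to mirror the aggregated JSON above
```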
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BangumiBase/asobiasobase | 2023-09-29T07:14:24.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Asobi Asobase
This is the image base of bangumi Asobi Asobase; we detected 33 characters and 3159 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
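As a starting point for that preprocessing workflow, here is a minimal download-and-extract sketch using `huggingface_hub`; the repo id and the `all.zip` filename come from this card, while the extraction target is arbitrary:
```python
import zipfile

from huggingface_hub import hf_hub_download

# Fetch the full image base advertised above and unpack it locally.
archive = hf_hub_download(repo_id="BangumiBase/asobiasobase",
                          filename="all.zip",
                          repo_type="dataset")
with zipfile.ZipFile(archive) as zf:
    zf.extractall("asobiasobase")  # then manually filter the ~1% noisy samples
```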
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 483 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 149 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 65 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 14 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 22 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 9 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 9 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 11 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 829 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 25 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 117 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 31 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 89 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 35 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 157 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 31 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 43 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 647 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 13 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 70 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 21 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 22 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 30 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 13 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 11 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 44 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 20 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 10 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 8 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 9 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 10 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 6 | [Download](31/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| noise | 106 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
CyberHarem/sakurai_momoka_idolmastercinderellagirls | 2023-09-17T17:35:34.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sakurai_momoka (THE iDOLM@STER: Cinderella Girls)
This is the dataset of sakurai_momoka (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 524 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 524 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 524 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 524 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
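To use one of the variants above, here is a minimal sketch that downloads the 512x704 aligned archive and iterates over the extracted images; the repo id and filename come from the table, while the PNG extension is an assumption (adjust the glob if the archive uses another format):
```python
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download
from PIL import Image

# Download the 512x704 aligned variant listed in the table above.
archive = hf_hub_download(repo_id="CyberHarem/sakurai_momoka_idolmastercinderellagirls",
                          filename="dataset-512x704.zip",
                          repo_type="dataset")
with zipfile.ZipFile(archive) as zf:
    zf.extractall("sakurai_momoka")

# Walk the extracted files; tag files, if present, sit alongside the images.
for img_path in sorted(Path("sakurai_momoka").rglob("*.png")):  # assumed extension
    with Image.open(img_path) as im:
        print(img_path.name, im.size)
```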
|
open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2b | 2023-09-13T14:26:55.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of migtissera/Synthia-70B-v1.2b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Synthia-70B-v1.2b](https://huggingface.co/migtissera/Synthia-70B-v1.2b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T14:25:34.731307](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2b/blob/main/results_2023-09-13T14-25-34.731307.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6874964217832943,\n\
\ \"acc_stderr\": 0.03135304734670862,\n \"acc_norm\": 0.6913041638648051,\n\
\ \"acc_norm_stderr\": 0.031323841726305535,\n \"mc1\": 0.412484700122399,\n\
\ \"mc1_stderr\": 0.017233299399571217,\n \"mc2\": 0.5769082726326263,\n\
\ \"mc2_stderr\": 0.014903392770102011\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6501706484641638,\n \"acc_stderr\": 0.013936809212158289,\n\
\ \"acc_norm\": 0.6877133105802048,\n \"acc_norm_stderr\": 0.013542598541688067\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6886078470424218,\n\
\ \"acc_stderr\": 0.004621163476949207,\n \"acc_norm\": 0.8757219677355108,\n\
\ \"acc_norm_stderr\": 0.0032922425436373404\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.030783736757745653,\n\
\ \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.030783736757745653\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.02555992055053101,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.02555992055053101\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"\
acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"\
acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.02325315795194208,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02325315795194208\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465942,\n\
\ \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465942\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114996,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114996\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7478991596638656,\n \"acc_stderr\": 0.028205545033277726,\n\
\ \"acc_norm\": 0.7478991596638656,\n \"acc_norm_stderr\": 0.028205545033277726\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958788,\n \"\
acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958788\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.01990739979131694,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.01990739979131694\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \
\ \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n\
\ \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n\
\ \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744633,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744633\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044579,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044579\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n\
\ \"acc_stderr\": 0.018724301741941635,\n \"acc_norm\": 0.9102564102564102,\n\
\ \"acc_norm_stderr\": 0.018724301741941635\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8646232439335888,\n\
\ \"acc_stderr\": 0.012234384586856491,\n \"acc_norm\": 0.8646232439335888,\n\
\ \"acc_norm_stderr\": 0.012234384586856491\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.021855255263421795,\n\
\ \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.021855255263421795\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.49162011173184356,\n\
\ \"acc_stderr\": 0.01672015279467255,\n \"acc_norm\": 0.49162011173184356,\n\
\ \"acc_norm_stderr\": 0.01672015279467255\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667874,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n\
\ \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n\
\ \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.808641975308642,\n \"acc_stderr\": 0.021887704613396154,\n\
\ \"acc_norm\": 0.808641975308642,\n \"acc_norm_stderr\": 0.021887704613396154\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5390070921985816,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.5390070921985816,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5410691003911343,\n\
\ \"acc_stderr\": 0.012727084826799807,\n \"acc_norm\": 0.5410691003911343,\n\
\ \"acc_norm_stderr\": 0.012727084826799807\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7369281045751634,\n \"acc_stderr\": 0.017812676542320657,\n \
\ \"acc_norm\": 0.7369281045751634,\n \"acc_norm_stderr\": 0.017812676542320657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7959183673469388,\n \"acc_stderr\": 0.025801283475090496,\n\
\ \"acc_norm\": 0.7959183673469388,\n \"acc_norm_stderr\": 0.025801283475090496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070796,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070796\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.412484700122399,\n\
\ \"mc1_stderr\": 0.017233299399571217,\n \"mc2\": 0.5769082726326263,\n\
\ \"mc2_stderr\": 0.014903392770102011\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/Synthia-70B-v1.2b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|arc:challenge|25_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hellaswag|10_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T14-25-34.731307.parquet'
- config_name: results
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- results_2023-09-13T14-25-34.731307.parquet
- split: latest
path:
- results_2023-09-13T14-25-34.731307.parquet
---
# Dataset Card for Evaluation run of migtissera/Synthia-70B-v1.2b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Synthia-70B-v1.2b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Synthia-70B-v1.2b](https://huggingface.co/migtissera/Synthia-70B-v1.2b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2b",
"harness_truthfulqa_mc_0",
split="train")
```
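Each evaluated task has its own config name following the pattern above. If you are unsure which configs exist for a given run, a minimal sketch (using the `get_dataset_config_names` helper from the `datasets` library) is to list them first:
```python
from datasets import get_dataset_config_names

# List every available config for this details repo: one config per evaluated
# task, plus the aggregated "results" config.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2b"
)
print(configs)
```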
## Latest results
These are the [latest results from run 2023-09-13T14:25:34.731307](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2b/blob/main/results_2023-09-13T14-25-34.731307.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6874964217832943,
"acc_stderr": 0.03135304734670862,
"acc_norm": 0.6913041638648051,
"acc_norm_stderr": 0.031323841726305535,
"mc1": 0.412484700122399,
"mc1_stderr": 0.017233299399571217,
"mc2": 0.5769082726326263,
"mc2_stderr": 0.014903392770102011
},
"harness|arc:challenge|25": {
"acc": 0.6501706484641638,
"acc_stderr": 0.013936809212158289,
"acc_norm": 0.6877133105802048,
"acc_norm_stderr": 0.013542598541688067
},
"harness|hellaswag|10": {
"acc": 0.6886078470424218,
"acc_stderr": 0.004621163476949207,
"acc_norm": 0.8757219677355108,
"acc_norm_stderr": 0.0032922425436373404
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.030783736757745653,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.030783736757745653
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.02555992055053101,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.02555992055053101
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.02325315795194208,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.02325315795194208
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465942,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465942
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114996,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114996
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7478991596638656,
"acc_stderr": 0.028205545033277726,
"acc_norm": 0.7478991596638656,
"acc_norm_stderr": 0.028205545033277726
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8880733944954129,
"acc_stderr": 0.013517352714958788,
"acc_norm": 0.8880733944954129,
"acc_norm_stderr": 0.013517352714958788
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.01990739979131694,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.01990739979131694
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.022535526352692705,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.022535526352692705
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.026478240960489365,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.026478240960489365
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744633,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744633
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.03092278832044579,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.03092278832044579
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489122,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489122
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.018724301741941635,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.018724301741941635
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8646232439335888,
"acc_stderr": 0.012234384586856491,
"acc_norm": 0.8646232439335888,
"acc_norm_stderr": 0.012234384586856491
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.021855255263421795,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.021855255263421795
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.49162011173184356,
"acc_stderr": 0.01672015279467255,
"acc_norm": 0.49162011173184356,
"acc_norm_stderr": 0.01672015279467255
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667874,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.808641975308642,
"acc_stderr": 0.021887704613396154,
"acc_norm": 0.808641975308642,
"acc_norm_stderr": 0.021887704613396154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5390070921985816,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.5390070921985816,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5410691003911343,
"acc_stderr": 0.012727084826799807,
"acc_norm": 0.5410691003911343,
"acc_norm_stderr": 0.012727084826799807
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7369281045751634,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.7369281045751634,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7959183673469388,
"acc_stderr": 0.025801283475090496,
"acc_norm": 0.7959183673469388,
"acc_norm_stderr": 0.025801283475090496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070796,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070796
},
"harness|truthfulqa:mc|0": {
"mc1": 0.412484700122399,
"mc1_stderr": 0.017233299399571217,
"mc2": 0.5769082726326263,
"mc2_stderr": 0.014903392770102011
}
}
```
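Rather than reading the JSON above by hand, a minimal sketch for working with these aggregated numbers programmatically is to load the "results" config at its "latest" split (the split names follow the configs declared in this card's metadata):
```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" config for the latest run.
results = load_dataset(
    "open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2b",
    "results",
    split="latest",
)
# The stored rows hold the aggregated metric values for the run.
print(results[0])
```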
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_bsp-albz__llama2-13b-platypus-ckpt-1000 | 2023-09-13T14:32:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of bsp-albz/llama2-13b-platypus-ckpt-1000
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bsp-albz/llama2-13b-platypus-ckpt-1000](https://huggingface.co/bsp-albz/llama2-13b-platypus-ckpt-1000)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bsp-albz__llama2-13b-platypus-ckpt-1000\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T14:31:01.492634](https://huggingface.co/datasets/open-llm-leaderboard/details_bsp-albz__llama2-13b-platypus-ckpt-1000/blob/main/results_2023-09-13T14-31-01.492634.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23209914145592347,\n\
\ \"acc_stderr\": 0.030502665385474075,\n \"acc_norm\": 0.2331657411610066,\n\
\ \"acc_norm_stderr\": 0.030518929327359584,\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.01502635482491078,\n \"mc2\": 0.48790901949076826,\n\
\ \"mc2_stderr\": 0.016764656477743382\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22610921501706485,\n \"acc_stderr\": 0.012224202097063284,\n\
\ \"acc_norm\": 0.2815699658703072,\n \"acc_norm_stderr\": 0.013143376735009009\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25801633140808605,\n\
\ \"acc_stderr\": 0.004366488167386391,\n \"acc_norm\": 0.26548496315475006,\n\
\ \"acc_norm_stderr\": 0.004406886100685858\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.037125378336148665,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.037125378336148665\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03459777606810536,\n\
\ \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03459777606810536\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.19245283018867926,\n \"acc_stderr\": 0.02426297983937228,\n\
\ \"acc_norm\": 0.19245283018867926,\n \"acc_norm_stderr\": 0.02426297983937228\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929774,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929774\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2170212765957447,\n \"acc_stderr\": 0.026947483121496234,\n\
\ \"acc_norm\": 0.2170212765957447,\n \"acc_norm_stderr\": 0.026947483121496234\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537317,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537317\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.034165204477475494,\n\
\ \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.034165204477475494\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2275132275132275,\n \"acc_stderr\": 0.021591269407823774,\n \"\
acc_norm\": 0.2275132275132275,\n \"acc_norm_stderr\": 0.021591269407823774\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.02458002892148101,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.02458002892148101\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.033661244890514495,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.033661244890514495\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.1696969696969697,\n \"acc_stderr\": 0.02931118867498312,\n\
\ \"acc_norm\": 0.1696969696969697,\n \"acc_norm_stderr\": 0.02931118867498312\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.22727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"\
acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3005181347150259,\n \"acc_stderr\": 0.033088185944157515,\n\
\ \"acc_norm\": 0.3005181347150259,\n \"acc_norm_stderr\": 0.033088185944157515\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.02213908110397154,\n \
\ \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.02213908110397154\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871937,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871937\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31092436974789917,\n \"acc_stderr\": 0.030066761582977934,\n\
\ \"acc_norm\": 0.31092436974789917,\n \"acc_norm_stderr\": 0.030066761582977934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763744,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763744\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22568807339449543,\n \"acc_stderr\": 0.01792308766780306,\n \"\
acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.01792308766780306\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25462962962962965,\n \"acc_stderr\": 0.02971127586000535,\n \"\
acc_norm\": 0.25462962962962965,\n \"acc_norm_stderr\": 0.02971127586000535\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.20675105485232068,\n \"acc_stderr\": 0.02636165166838909,\n \
\ \"acc_norm\": 0.20675105485232068,\n \"acc_norm_stderr\": 0.02636165166838909\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.15695067264573992,\n\
\ \"acc_stderr\": 0.024413587174907405,\n \"acc_norm\": 0.15695067264573992,\n\
\ \"acc_norm_stderr\": 0.024413587174907405\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.19083969465648856,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.19083969465648856,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.17592592592592593,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.17592592592592593,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.19631901840490798,\n \"acc_stderr\": 0.031207970394709215,\n\
\ \"acc_norm\": 0.19631901840490798,\n \"acc_norm_stderr\": 0.031207970394709215\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n\
\ \"acc_stderr\": 0.03485946096475743,\n \"acc_norm\": 0.16071428571428573,\n\
\ \"acc_norm_stderr\": 0.03485946096475743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.1794871794871795,\n\
\ \"acc_stderr\": 0.02514093595033544,\n \"acc_norm\": 0.1794871794871795,\n\
\ \"acc_norm_stderr\": 0.02514093595033544\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.19029374201787994,\n\
\ \"acc_stderr\": 0.014036945850381384,\n \"acc_norm\": 0.19029374201787994,\n\
\ \"acc_norm_stderr\": 0.014036945850381384\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.21676300578034682,\n \"acc_stderr\": 0.02218347766841286,\n\
\ \"acc_norm\": 0.21676300578034682,\n \"acc_norm_stderr\": 0.02218347766841286\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26256983240223464,\n\
\ \"acc_stderr\": 0.01471682427301773,\n \"acc_norm\": 0.26256983240223464,\n\
\ \"acc_norm_stderr\": 0.01471682427301773\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958147,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958147\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21864951768488747,\n\
\ \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.21864951768488747,\n\
\ \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02438366553103546,\n\
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02438366553103546\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2198581560283688,\n \"acc_stderr\": 0.024706141070705477,\n \
\ \"acc_norm\": 0.2198581560283688,\n \"acc_norm_stderr\": 0.024706141070705477\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24902216427640156,\n\
\ \"acc_stderr\": 0.01104489226404077,\n \"acc_norm\": 0.24902216427640156,\n\
\ \"acc_norm_stderr\": 0.01104489226404077\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24019607843137256,\n \"acc_stderr\": 0.017282760695167425,\n \
\ \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.017282760695167425\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.18181818181818182,\n\
\ \"acc_stderr\": 0.036942843353378,\n \"acc_norm\": 0.18181818181818182,\n\
\ \"acc_norm_stderr\": 0.036942843353378\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2897959183673469,\n \"acc_stderr\": 0.02904308868330433,\n\
\ \"acc_norm\": 0.2897959183673469,\n \"acc_norm_stderr\": 0.02904308868330433\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.208955223880597,\n\
\ \"acc_stderr\": 0.028748298931728665,\n \"acc_norm\": 0.208955223880597,\n\
\ \"acc_norm_stderr\": 0.028748298931728665\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2289156626506024,\n\
\ \"acc_stderr\": 0.03270745277352477,\n \"acc_norm\": 0.2289156626506024,\n\
\ \"acc_norm_stderr\": 0.03270745277352477\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21637426900584794,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.21637426900584794,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.01502635482491078,\n \"mc2\": 0.48790901949076826,\n\
\ \"mc2_stderr\": 0.016764656477743382\n }\n}\n```"
repo_url: https://huggingface.co/bsp-albz/llama2-13b-platypus-ckpt-1000
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|arc:challenge|25_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hellaswag|10_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T14-31-01.492634.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T14-31-01.492634.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T14-31-01.492634.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T14-31-01.492634.parquet'
- config_name: results
data_files:
- split: 2023_09_13T14_31_01.492634
path:
- results_2023-09-13T14-31-01.492634.parquet
- split: latest
path:
- results_2023-09-13T14-31-01.492634.parquet
---
# Dataset Card for Evaluation run of bsp-albz/llama2-13b-platypus-ckpt-1000
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bsp-albz/llama2-13b-platypus-ckpt-1000
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bsp-albz/llama2-13b-platypus-ckpt-1000](https://huggingface.co/bsp-albz/llama2-13b-platypus-ckpt-1000) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bsp-albz__llama2-13b-platypus-ckpt-1000",
"harness_truthfulqa_mc_0",
split="train")
```
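Each configuration exposes two named splits (the run timestamp and `latest`), so `split="latest"` always returns the most recent results. As a minimal sketch (assuming the `results` parquet holds the flattened aggregate metrics, which is an assumption about its layout), the aggregated scores can be loaded the same way:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run.
results = load_dataset("open-llm-leaderboard/details_bsp-albz__llama2-13b-platypus-ckpt-1000",
	"results",
	split="latest")
print(results[0])  # one flattened record of aggregate scores for the run
```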
## Latest results
These are the [latest results from run 2023-09-13T14:31:01.492634](https://huggingface.co/datasets/open-llm-leaderboard/details_bsp-albz__llama2-13b-platypus-ckpt-1000/blob/main/results_2023-09-13T14-31-01.492634.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23209914145592347,
"acc_stderr": 0.030502665385474075,
"acc_norm": 0.2331657411610066,
"acc_norm_stderr": 0.030518929327359584,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.01502635482491078,
"mc2": 0.48790901949076826,
"mc2_stderr": 0.016764656477743382
},
"harness|arc:challenge|25": {
"acc": 0.22610921501706485,
"acc_stderr": 0.012224202097063284,
"acc_norm": 0.2815699658703072,
"acc_norm_stderr": 0.013143376735009009
},
"harness|hellaswag|10": {
"acc": 0.25801633140808605,
"acc_stderr": 0.004366488167386391,
"acc_norm": 0.26548496315475006,
"acc_norm_stderr": 0.004406886100685858
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.037125378336148665,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.037125378336148665
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03459777606810536,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03459777606810536
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.19245283018867926,
"acc_stderr": 0.02426297983937228,
"acc_norm": 0.19245283018867926,
"acc_norm_stderr": 0.02426297983937228
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929774,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929774
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2170212765957447,
"acc_stderr": 0.026947483121496234,
"acc_norm": 0.2170212765957447,
"acc_norm_stderr": 0.026947483121496234
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537317,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537317
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2275132275132275,
"acc_stderr": 0.021591269407823774,
"acc_norm": 0.2275132275132275,
"acc_norm_stderr": 0.021591269407823774
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.02458002892148101,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.02458002892148101
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.033661244890514495,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.033661244890514495
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.1696969696969697,
"acc_stderr": 0.02931118867498312,
"acc_norm": 0.1696969696969697,
"acc_norm_stderr": 0.02931118867498312
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3005181347150259,
"acc_stderr": 0.033088185944157515,
"acc_norm": 0.3005181347150259,
"acc_norm_stderr": 0.033088185944157515
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02213908110397154,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02213908110397154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871937,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871937
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31092436974789917,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.31092436974789917,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763744,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763744
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22568807339449543,
"acc_stderr": 0.01792308766780306,
"acc_norm": 0.22568807339449543,
"acc_norm_stderr": 0.01792308766780306
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25462962962962965,
"acc_stderr": 0.02971127586000535,
"acc_norm": 0.25462962962962965,
"acc_norm_stderr": 0.02971127586000535
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20675105485232068,
"acc_stderr": 0.02636165166838909,
"acc_norm": 0.20675105485232068,
"acc_norm_stderr": 0.02636165166838909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.15695067264573992,
"acc_stderr": 0.024413587174907405,
"acc_norm": 0.15695067264573992,
"acc_norm_stderr": 0.024413587174907405
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.19083969465648856,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.19083969465648856,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.17592592592592593,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.17592592592592593,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.19631901840490798,
"acc_stderr": 0.031207970394709215,
"acc_norm": 0.19631901840490798,
"acc_norm_stderr": 0.031207970394709215
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16071428571428573,
"acc_stderr": 0.03485946096475743,
"acc_norm": 0.16071428571428573,
"acc_norm_stderr": 0.03485946096475743
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.1794871794871795,
"acc_stderr": 0.02514093595033544,
"acc_norm": 0.1794871794871795,
"acc_norm_stderr": 0.02514093595033544
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.19029374201787994,
"acc_stderr": 0.014036945850381384,
"acc_norm": 0.19029374201787994,
"acc_norm_stderr": 0.014036945850381384
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21676300578034682,
"acc_stderr": 0.02218347766841286,
"acc_norm": 0.21676300578034682,
"acc_norm_stderr": 0.02218347766841286
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26256983240223464,
"acc_stderr": 0.01471682427301773,
"acc_norm": 0.26256983240223464,
"acc_norm_stderr": 0.01471682427301773
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.21864951768488747,
"acc_stderr": 0.02347558141786111,
"acc_norm": 0.21864951768488747,
"acc_norm_stderr": 0.02347558141786111
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02438366553103546,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02438366553103546
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2198581560283688,
"acc_stderr": 0.024706141070705477,
"acc_norm": 0.2198581560283688,
"acc_norm_stderr": 0.024706141070705477
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24902216427640156,
"acc_stderr": 0.01104489226404077,
"acc_norm": 0.24902216427640156,
"acc_norm_stderr": 0.01104489226404077
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3125,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.017282760695167425,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.017282760695167425
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.036942843353378,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.036942843353378
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2897959183673469,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.2897959183673469,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.208955223880597,
"acc_stderr": 0.028748298931728665,
"acc_norm": 0.208955223880597,
"acc_norm_stderr": 0.028748298931728665
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2289156626506024,
"acc_stderr": 0.03270745277352477,
"acc_norm": 0.2289156626506024,
"acc_norm_stderr": 0.03270745277352477
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21637426900584794,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.21637426900584794,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.01502635482491078,
"mc2": 0.48790901949076826,
"mc2_stderr": 0.016764656477743382
}
}
```
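For a quick programmatic check of these numbers, the sketch below downloads the linked results file and averages the per-subtask MMLU (`hendrycksTest`) accuracies; the top-level `"results"` key is an assumption about the JSON layout, with a fallback to the root.
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file from the dataset repo (filename taken from the link above).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_bsp-albz__llama2-13b-platypus-ckpt-1000",
    filename="results_2023-09-13T14-31-01.492634.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The metrics shown above may sit under a top-level "results" key; fall back to the root.
metrics = data.get("results", data)

# Average accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = {k: v for k, v in metrics.items() if k.startswith("harness|hendrycksTest-")}
print(f"Mean acc over {len(mmlu)} MMLU subtasks:",
      sum(v["acc"] for v in mmlu.values()) / len(mmlu))
```
Given the per-task scores above, the mean sits near chance level (~0.23), consistent with the `"all"` aggregate reported in the dict.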
### Supported Tasks and Leaderboards
This dataset backs the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard); the evaluated tasks are those listed in the configurations above (ARC-Challenge, HellaSwag, the MMLU/hendrycksTest subtasks, and TruthfulQA).
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
Each configuration exposes one split per run, named after the run timestamp (e.g. `2023_09_13T14_31_01.492634`), plus a `latest` split that always points to the most recent run.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
olcsec/MetaGPT | 2023-09-13T14:31:22.000Z | [
"region:us"
] | olcsec | null | null | null | 0 | 0 | Entry not found |