| id stringlengths 2 115 | lastModified stringlengths 24 24 | tags list | author stringlengths 2 42 ⌀ | description stringlengths 0 68.7k ⌀ | citation stringlengths 0 10.7k ⌀ | cardData null | likes int64 0 3.55k | downloads int64 0 10.1M | card stringlengths 0 1.01M |
|---|---|---|---|---|---|---|---|---|---|
niyar/test-tree-rings | 2023-10-09T23:05:28.000Z | [
"region:us"
] | niyar | null | null | null | 0 | 0 | Entry not found |
TokenWhisperer/restaurants-2014-v2 | 2023-10-09T23:33:12.000Z | [
"region:us"
] | TokenWhisperer | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2588312
num_examples: 2957
- name: test
num_bytes: 749802
num_examples: 786
download_size: 458040
dataset_size: 3338114
---
# Dataset Card for "restaurants-2014-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
johannes-garstenauer/embeddings_from_distilbert_masking_heaps_and_eval_part1 | 2023-10-09T23:36:10.000Z | [
"region:us"
] | johannes-garstenauer | null | null | null | 1 | 0 | ---
dataset_info:
features:
- name: struct
dtype: string
- name: label
dtype: int64
- name: pred
dtype: int64
- name: cls_layer_6
sequence: float32
- name: cls_layer_5
sequence: float32
- name: cls_layer_4
sequence: float32
splits:
- name: train
num_bytes: 1281395185
num_examples: 134495
download_size: 1491732485
dataset_size: 1281395185
---
# Dataset Card for "embeddings_from_distilbert_masking_heaps_and_eval_part1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b | 2023-10-10T03:10:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PocketDoc/Dans-TotSirocco-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PocketDoc/Dans-TotSirocco-7b](https://huggingface.co/PocketDoc/Dans-TotSirocco-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T03:08:42.670420](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b/blob/main/results_2023-10-10T03-08-42.670420.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6409687009620687,\n\
\ \"acc_stderr\": 0.03296983999596416,\n \"acc_norm\": 0.6449214031358174,\n\
\ \"acc_norm_stderr\": 0.03294705912168925,\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.46494950680092745,\n\
\ \"mc2_stderr\": 0.014635222672125407\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.014370358632472439,\n\
\ \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6389165504879506,\n\
\ \"acc_stderr\": 0.004793330525656209,\n \"acc_norm\": 0.8422624975104561,\n\
\ \"acc_norm_stderr\": 0.003637497708934041\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.038607315993160904,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.038607315993160904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124495,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124495\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.01606005626853035,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.01606005626853035\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.03957835471980979,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.03957835471980979\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709225,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709225\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\
\ \"acc_stderr\": 0.013853724170922524,\n \"acc_norm\": 0.8160919540229885,\n\
\ \"acc_norm_stderr\": 0.013853724170922524\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468355,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468355\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3229050279329609,\n\
\ \"acc_stderr\": 0.015638440380241488,\n \"acc_norm\": 0.3229050279329609,\n\
\ \"acc_norm_stderr\": 0.015638440380241488\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\
\ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\
\ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039656,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039656\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6879084967320261,\n \"acc_stderr\": 0.018745011201277657,\n \
\ \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.018745011201277657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.46494950680092745,\n\
\ \"mc2_stderr\": 0.014635222672125407\n }\n}\n```"
repo_url: https://huggingface.co/PocketDoc/Dans-TotSirocco-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|arc:challenge|25_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|arc:challenge|25_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hellaswag|10_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hellaswag|10_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T03-08-42.670420.parquet'
- config_name: results
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- results_2023-10-09T23-41-30.846721.parquet
- split: 2023_10_10T03_08_42.670420
path:
- results_2023-10-10T03-08-42.670420.parquet
- split: latest
path:
- results_2023-10-10T03-08-42.670420.parquet
---
# Dataset Card for Evaluation run of PocketDoc/Dans-TotSirocco-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PocketDoc/Dans-TotSirocco-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PocketDoc/Dans-TotSirocco-7b](https://huggingface.co/PocketDoc/Dans-TotSirocco-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b",
"harness_truthfulqa_mc_0",
	split="latest")
```
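To load the details of a specific run rather than the latest one, pass one of the timestamped split names defined in the configuration list above, for instance:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b",
	"harness_truthfulqa_mc_0",
	split="2023_10_10T03_08_42.670420")
```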
## Latest results
These are the [latest results from run 2023-10-10T03:08:42.670420](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b/blob/main/results_2023-10-10T03-08-42.670420.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6409687009620687,
"acc_stderr": 0.03296983999596416,
"acc_norm": 0.6449214031358174,
"acc_norm_stderr": 0.03294705912168925,
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559693,
"mc2": 0.46494950680092745,
"mc2_stderr": 0.014635222672125407
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.014370358632472439,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.014182119866974872
},
"harness|hellaswag|10": {
"acc": 0.6389165504879506,
"acc_stderr": 0.004793330525656209,
"acc_norm": 0.8422624975104561,
"acc_norm_stderr": 0.003637497708934041
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.038607315993160904,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.038607315993160904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.01606005626853035,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.01606005626853035
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980979,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980979
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709225,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709225
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922524,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922524
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468355,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468355
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3229050279329609,
"acc_stderr": 0.015638440380241488,
"acc_norm": 0.3229050279329609,
"acc_norm_stderr": 0.015638440380241488
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695814,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.02850145286039656,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.02850145286039656
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6879084967320261,
"acc_stderr": 0.018745011201277657,
"acc_norm": 0.6879084967320261,
"acc_norm_stderr": 0.018745011201277657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960238,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960238
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559693,
"mc2": 0.46494950680092745,
"mc2_stderr": 0.014635222672125407
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ABD7667/fgffgfgf | 2023-10-09T23:49:35.000Z | [
"region:us"
] | ABD7667 | null | null | null | 0 | 0 | Entry not found |
AsAHuman/AnomalyCLIP | 2023-10-10T10:58:04.000Z | [
"region:us"
] | AsAHuman | null | null | null | 0 | 0 | Entry not found |
dylanebert/igf-results | 2023-10-10T01:13:12.000Z | [
"license:mit",
"region:us"
] | dylanebert | null | null | null | 0 | 0 | ---
license: mit
---
|
KenDoStudio/MMVCServerSIO-demo | 2023-10-10T01:02:42.000Z | [
"region:us"
] | KenDoStudio | null | null | null | 0 | 0 | Entry not found |
Ahmedshaaban/anniversary | 2023-10-10T00:39:51.000Z | [
"license:apache-2.0",
"region:us"
] | Ahmedshaaban | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
GoodiniSu/toma | 2023-10-10T01:09:24.000Z | [
"license:mit",
"region:us"
] | GoodiniSu | null | null | null | 0 | 0 | ---
license: mit
---
|
stanmalkinson199/KyleBroflovskiClassic | 2023-10-10T01:43:09.000Z | [
"license:openrail",
"region:us"
] | stanmalkinson199 | null | null | null | 0 | 0 | ---
license: openrail
---
|
roborovski/diffusiondb-seq2seq | 2023-10-10T03:04:26.000Z | [
"region:us"
] | roborovski | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: subject
dtype: string
- name: descriptor
dtype: string
splits:
- name: train
num_bytes: 10079006
num_examples: 93834
download_size: 6236928
dataset_size: 10079006
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "diffusiondb-seq2seq"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DylanJHJ/EXP4PDS | 2023-10-10T02:29:51.000Z | [
"license:apache-2.0",
"region:us"
] | DylanJHJ | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
LSVR/Vetroquis | 2023-10-10T02:45:50.000Z | [
"region:us"
] | LSVR | null | null | null | 0 | 0 | Entry not found |
sheepy928/purdue_reddit_posts_2017_2022 | 2023-10-10T02:49:57.000Z | [
"region:us"
] | sheepy928 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: title
dtype: string
- name: selftext
dtype: string
- name: created_utc
dtype: timestamp[ns]
- name: url
dtype: string
- name: author
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 25865572
num_examples: 78849
download_size: 15617426
dataset_size: 25865572
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "purdue_reddit_posts_2017_2022"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sheepy928/Purdue_reddit_posts_1500 | 2023-10-10T02:50:03.000Z | [
"region:us"
] | sheepy928 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: title
dtype: string
- name: selftext
dtype: string
- name: created_utc
dtype: timestamp[ns]
- name: url
dtype: string
- name: author
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 504948
num_examples: 1500
download_size: 321568
dataset_size: 504948
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Purdue_reddit_posts_1500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LSVR1806/Vetroquis | 2023-10-10T03:03:43.000Z | [
"region:us"
] | LSVR1806 | null | null | null | 0 | 0 | Entry not found |
iara-project/test_split_with_embeddings_bert_base_portuguese | 2023-10-10T03:04:17.000Z | [
"region:us"
] | iara-project | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: news_id
dtype: int64
- name: embeddings
dtype: int64
- name: sentence
dtype: string
- name: category
dtype: string
splits:
- name: test
num_bytes: 588008891
num_examples: 176114
download_size: 365796407
dataset_size: 588008891
---
# Dataset Card for "test_split_with_embeddings_bert_base_portuguese"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mmhzlrj/Genealogy | 2023-10-10T03:34:14.000Z | [
"language:zh",
"license:apache-2.0",
"region:us"
] | mmhzlrj | null | null | null | 0 | 0 | ---
license: apache-2.0
language:
- zh
---
The dataset contains the cover and 164 pages of a genealogy book, a combination of simplified and traditional Chinese characters typeset vertically. |
SRGui/autotrain-data-resnet50_test | 2023-10-10T05:31:27.000Z | [
"task_categories:image-classification",
"region:us"
] | SRGui | null | null | null | 0 | 0 | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: resnet50_test
## Dataset Description
This dataset has been automatically processed by AutoTrain for project resnet50_test.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<1920x1920 RGB PIL image>",
"target": 2
},
{
"image": "<1080x721 RGB PIL image>",
"target": 2
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['000', '005', '033'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and a validation split. The split sizes are as follows (a minimal loading sketch follows the table):
| Split name | Num samples |
| ------------ | ------------------- |
| train | 2244 |
| valid | 564 |
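A minimal loading sketch (the repository id is this entry's dataset id and the split names follow the table above; public accessibility of the repository is an assumption):
```python
from datasets import load_dataset

# Split names "train" and "valid" follow the split table above
train_ds = load_dataset("SRGui/autotrain-data-resnet50_test", split="train")
valid_ds = load_dataset("SRGui/autotrain-data-resnet50_test", split="valid")
print(train_ds[0]["target"], train_ds[0]["image"].size)
```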
|
yuyijiong/LongAlpaca-chinese | 2023-10-10T03:43:02.000Z | [
"task_categories:text-generation",
"language:zh",
"license:cc-by-nc-4.0",
"region:us"
] | yuyijiong | null | null | null | 0 | 0 | ---
license: cc-by-nc-4.0
task_categories:
- text-generation
language:
- zh
---
A version of Yukang/LongAlpaca-12k machine-translated into Chinese with Google Translate; the translation quality still leaves room for improvement, but it is currently serviceable. \
The original dataset has been split into three parts, book_sum, paper_review, and paper_compare, for a total of about 9k long-text instruction-tuning examples. \
The original dataset also contains 3k short-text instruction-tuning examples extracted from the alpaca dataset; that alpaca portion is not included in this project.
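A minimal loading sketch (the dataset id comes from this entry; the `train` split name is an assumption, since the card does not state how the three parts are exposed):
```python
from datasets import load_dataset

# Load the Chinese translation of LongAlpaca; the "train" split name is assumed here
ds = load_dataset("yuyijiong/LongAlpaca-chinese", split="train")
print(ds[0])
```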
|
ilyas3141/ilias_test16 | 2023-10-10T03:45:39.000Z | [
"region:us"
] | ilyas3141 | null | null | null | 0 | 0 | Entry not found |
stanmalkinson199/TweekTweakPTBR | 2023-10-10T03:55:11.000Z | [
"license:openrail",
"region:us"
] | stanmalkinson199 | null | null | null | 0 | 0 | ---
license: openrail
---
|
erhwenkuo/wikinews-zhtw | 2023-10-10T04:06:53.000Z | [
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:zh",
"license:cc-by-sa-3.0",
"region:us"
] | erhwenkuo | null | null | null | 0 | 0 | ---
dataset_info:
config_name: '20231001'
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 13647957
num_examples: 9827
download_size: 8803739
dataset_size: 13647957
configs:
- config_name: '20231001'
data_files:
- split: train
path: 20231001/train-*
license: cc-by-sa-3.0
task_categories:
- text-generation
language:
- zh
size_categories:
- 1K<n<10K
---
# Dataset Card for "wikinews-zhtw"
Wikinews is an online news outlet run by volunteers acting as citizen journalists. It is also a free-content wiki and one of the Wikimedia projects, operated by the Wikimedia Foundation. Wikinews works through a collaborative-journalism model and strives to report the news from a neutral point of view, including original first-hand exclusive reports and interviews.
This dataset is built from the Chinese `zhwikinews` download files in the Wikipedia dumps (https://dumps.wikimedia.org/). Each example contains the full content of one Wikinews article, cleaned to remove unwanted parts.
- **Homepage:** [https://dumps.wikimedia.org](https://dumps.wikimedia.org)
- **zhwikinews downloads:** [https://dumps.wikimedia.org/zhwikinews](https://dumps.wikimedia.org/zhwikinews/)
## Data Dump Versions
Because the Wikipedia dump site rolls its data over on a regular schedule, the following dumps were available for download as of `2023/10/10`:
|Data dump directory|Dump timestamp|
|-------------|--------|
|`20230520/`|01-Jul-2023 09:28|
|`20230601/`|20-Jul-2023 09:41|
|`20230620/`|01-Aug-2023 09:35|
|`20230701/`|20-Aug-2023 09:49|
|`20230720/`|01-Sep-2023 09:35|
|`20230801/`|20-Sep-2023 09:46|
|`20230820/`|01-Oct-2023 09:42|
|`20230901/`|02-Sep-2023 14:47|
|`20230920/`|21-Sep-2023 14:41|
|`20231001/`|10-Oct-2023 03:50|
|`latest/`|10-Oct-2023 03:50|
This dataset periodically takes the most recent dump with an explicit date and downloads and cleans it, to make verification and use easier.
## Data Download and Cleaning
1. Download the `zhwikinews` data dump file
2. Use the [WikiExtractor](https://github.com/attardi/wikiextractor) package to extract the article contents
3. Clean the data and convert it into jsonl files
4. Use the Hugging Face [Datasets](https://pypi.org/project/datasets/) package to load the jsonl files and upload them to the Hugging Face Hub (a minimal sketch of these steps follows below)
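A minimal sketch of steps 2-4, assuming the dump file has already been downloaded; the file name, output directory, and loading details below are illustrative assumptions, not the actual pipeline:
```python
# Step 2 (shell): extract article contents as JSON lines with WikiExtractor
#   python -m wikiextractor.WikiExtractor zhwikinews-20231001-pages-articles.xml.bz2 --json -o extracted
from datasets import load_dataset

# Steps 3-4: load the extracted JSON-lines files, keep only the fields used by
# this dataset, then upload the result to the Hugging Face Hub.
ds = load_dataset("json", data_files="extracted/**/wiki_*", split="train")
ds = ds.select_columns(["id", "url", "title", "text"])
ds.push_to_hub("erhwenkuo/wikinews-zhtw", config_name="20231001")  # requires a prior `huggingface-cli login`
```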
## Dataset Structure
An example looks as follows:
{'id': '35',
'url': 'https://zh.wikinews.org/wiki?curid=35',
'title': 'EDWIN與CUELLO遭統一獅隊解約',
'text': '曾經打過中國棒球聯賽的兩位外援球員EDWIN(臺譯:愛力)與CUELLO(臺譯:阿-{A|裡}-),昨天傳出...'
}
## Data Fields
The data fields are the same in all configurations:
- `id (str)`: ID of the article.
- `url (str)`: URL of the article.
- `title (str)`: Title of the article.
- `text (str)`: Text content of the article.
## Usage
```python
from datasets import load_dataset
# Specify the date of the data dump to use as the second argument
load_dataset("erhwenkuo/wikinews-zhtw", "20231001")
```
## Licensing Information
Most of Wikipedia's article content and many of its images are dual-licensed under the `Creative Commons Attribution-ShareAlike 3.0 Unported License (CC BY-SA)` and the `GNU Free Documentation License (GFDL)`.
## Citation
```
@ONLINE{wikidump,
author = "Wikimedia Foundation",
title = "Wikimedia Downloads",
url = "https://dumps.wikimedia.org"
}
``` |
open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b | 2023-10-10T04:06:22.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PocketDoc/Dans-AdventurousWinds-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PocketDoc/Dans-AdventurousWinds-7b](https://huggingface.co/PocketDoc/Dans-AdventurousWinds-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T04:04:57.551374](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b/blob/main/results_2023-10-10T04-04-57.551374.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6358155948116495,\n\
\ \"acc_stderr\": 0.03289402415505117,\n \"acc_norm\": 0.6398195855936059,\n\
\ \"acc_norm_stderr\": 0.03287193782770926,\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.015846315101394805,\n \"mc2\": 0.42654663056025405,\n\
\ \"mc2_stderr\": 0.014166995095721226\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5691126279863481,\n \"acc_stderr\": 0.014471133392642471,\n\
\ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892889\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6394144592710616,\n\
\ \"acc_stderr\": 0.004791890625834199,\n \"acc_norm\": 0.8346942840071699,\n\
\ \"acc_norm_stderr\": 0.003706970856410961\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.038781398887976104,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.038781398887976104\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082637,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082637\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.032081157507886836,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.032081157507886836\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601688,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601688\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593542,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593542\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.01633288239343138,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.01633288239343138\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281382,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281382\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\
\ \"acc_stderr\": 0.013853724170922526,\n \"acc_norm\": 0.8160919540229885,\n\
\ \"acc_norm_stderr\": 0.013853724170922526\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n\
\ \"acc_stderr\": 0.01544571691099888,\n \"acc_norm\": 0.30837988826815643,\n\
\ \"acc_norm_stderr\": 0.01544571691099888\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.024404394928087873,\n\
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.024404394928087873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4556714471968709,\n \"acc_stderr\": 0.012719949543032204,\n\
\ \"acc_norm\": 0.4556714471968709,\n \"acc_norm_stderr\": 0.012719949543032204\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398357,\n \"\
acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398357\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.015846315101394805,\n \"mc2\": 0.42654663056025405,\n\
\ \"mc2_stderr\": 0.014166995095721226\n }\n}\n```"
repo_url: https://huggingface.co/PocketDoc/Dans-AdventurousWinds-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|arc:challenge|25_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hellaswag|10_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T04-04-57.551374.parquet'
- config_name: results
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- results_2023-10-10T04-04-57.551374.parquet
- split: latest
path:
- results_2023-10-10T04-04-57.551374.parquet
---
# Dataset Card for Evaluation run of PocketDoc/Dans-AdventurousWinds-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PocketDoc/Dans-AdventurousWinds-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PocketDoc/Dans-AdventurousWinds-7b](https://huggingface.co/PocketDoc/Dans-AdventurousWinds-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b",
"harness_truthfulqa_mc_0",
    split="latest")
```
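Any `config_name` defined in the YAML header above can be loaded the same way. As a minimal sketch (the config and split names come from the header; the variable name is illustrative), here is how to read the latest per-sample details of a single MMLU subtask:
```python
from datasets import load_dataset

# "latest" always resolves to the most recent run; the timestamped split
# "2023_10_10T04_04_57.551374" pins this specific run instead.
subtask = load_dataset(
    "open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(subtask)
```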
## Latest results
These are the [latest results from run 2023-10-10T04:04:57.551374](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b/blob/main/results_2023-10-10T04-04-57.551374.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6358155948116495,
"acc_stderr": 0.03289402415505117,
"acc_norm": 0.6398195855936059,
"acc_norm_stderr": 0.03287193782770926,
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394805,
"mc2": 0.42654663056025405,
"mc2_stderr": 0.014166995095721226
},
"harness|arc:challenge|25": {
"acc": 0.5691126279863481,
"acc_stderr": 0.014471133392642471,
"acc_norm": 0.6100682593856656,
"acc_norm_stderr": 0.014252959848892889
},
"harness|hellaswag|10": {
"acc": 0.6394144592710616,
"acc_stderr": 0.004791890625834199,
"acc_norm": 0.8346942840071699,
"acc_norm_stderr": 0.003706970856410961
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082637,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082637
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.032081157507886836,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.032081157507886836
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601688,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601688
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593542,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.01633288239343138,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.01633288239343138
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281382,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922526,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30837988826815643,
"acc_stderr": 0.01544571691099888,
"acc_norm": 0.30837988826815643,
"acc_norm_stderr": 0.01544571691099888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.024404394928087873,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.024404394928087873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032204,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032204
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.02873932851398357,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.02873932851398357
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394805,
"mc2": 0.42654663056025405,
"mc2_stderr": 0.014166995095721226
}
}
```
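The same aggregated numbers are stored in the `results` config declared in the YAML header, so they can be read back programmatically. A minimal sketch, assuming only the config and split names given above (the parquet column layout is not documented in this card, so inspect a row before relying on it):
```python
from datasets import load_dataset

# The "results" config maps to results_2023-10-10T04-04-57.551374.parquet;
# its "latest" split points at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b",
    "results",
    split="latest",
)
print(results[0])  # column layout is undocumented here, so inspect a row first
```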
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ilyas3141/ilias_test17 | 2023-10-10T06:26:00.000Z | [
"region:us"
] | ilyas3141 | null | null | null | 0 | 0 | Entry not found |
unoooo/ko-alpaca | 2023-10-10T04:30:34.000Z | [
"region:us"
] | unoooo | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_maywell__Synatra-V0.1-7B | 2023-10-10T04:32:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of maywell/Synatra-V0.1-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [maywell/Synatra-V0.1-7B](https://huggingface.co/maywell/Synatra-V0.1-7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__Synatra-V0.1-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T04:30:58.971713](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-V0.1-7B/blob/main/results_2023-10-10T04-30-58.971713.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5526128967911348,\n\
\ \"acc_stderr\": 0.03462386322845972,\n \"acc_norm\": 0.5565308458707837,\n\
\ \"acc_norm_stderr\": 0.0346104765546302,\n \"mc1\": 0.390452876376989,\n\
\ \"mc1_stderr\": 0.017078230743431445,\n \"mc2\": 0.557562665558094,\n\
\ \"mc2_stderr\": 0.015250255723495946\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5179180887372014,\n \"acc_stderr\": 0.014602005585490975,\n\
\ \"acc_norm\": 0.552901023890785,\n \"acc_norm_stderr\": 0.014529380160526848\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5701055566620196,\n\
\ \"acc_stderr\": 0.004940490508240653,\n \"acc_norm\": 0.7662816172077276,\n\
\ \"acc_norm_stderr\": 0.004223302177263008\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874143,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874143\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.03260038511835771,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.03260038511835771\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115979,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115979\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n\
\ \"acc_stderr\": 0.02743086657997347,\n \"acc_norm\": 0.632258064516129,\n\
\ \"acc_norm_stderr\": 0.02743086657997347\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.034524539038220406,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.034524539038220406\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.035679697722680495,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.035679697722680495\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.028979089794296736,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.028979089794296736\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5358974358974359,\n \"acc_stderr\": 0.025285585990017848,\n\
\ \"acc_norm\": 0.5358974358974359,\n \"acc_norm_stderr\": 0.025285585990017848\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n\
\ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7577981651376147,\n \"acc_stderr\": 0.01836817630659862,\n \"\
acc_norm\": 0.7577981651376147,\n \"acc_norm_stderr\": 0.01836817630659862\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.696078431372549,\n \"acc_stderr\": 0.03228210387037892,\n \"acc_norm\"\
: 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037892\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7088607594936709,\n \"acc_stderr\": 0.029571601065753374,\n \"\
acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.029571601065753374\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497752,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497752\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7369093231162197,\n\
\ \"acc_stderr\": 0.01574549716904904,\n \"acc_norm\": 0.7369093231162197,\n\
\ \"acc_norm_stderr\": 0.01574549716904904\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.02632981334194624,\n\
\ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.02632981334194624\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25921787709497207,\n\
\ \"acc_stderr\": 0.014655780837497717,\n \"acc_norm\": 0.25921787709497207,\n\
\ \"acc_norm_stderr\": 0.014655780837497717\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809075,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809075\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\
\ \"acc_stderr\": 0.02755994980234782,\n \"acc_norm\": 0.6205787781350482,\n\
\ \"acc_norm_stderr\": 0.02755994980234782\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.027002521034516468,\n\
\ \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.027002521034516468\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614098,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614098\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4015645371577575,\n\
\ \"acc_stderr\": 0.012520315120147103,\n \"acc_norm\": 0.4015645371577575,\n\
\ \"acc_norm_stderr\": 0.012520315120147103\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213514,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213514\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5473856209150327,\n \"acc_stderr\": 0.020136790918492523,\n \
\ \"acc_norm\": 0.5473856209150327,\n \"acc_norm_stderr\": 0.020136790918492523\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n\
\ \"acc_stderr\": 0.032658195885126966,\n \"acc_norm\": 0.6915422885572139,\n\
\ \"acc_norm_stderr\": 0.032658195885126966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.390452876376989,\n\
\ \"mc1_stderr\": 0.017078230743431445,\n \"mc2\": 0.557562665558094,\n\
\ \"mc2_stderr\": 0.015250255723495946\n }\n}\n```"
repo_url: https://huggingface.co/maywell/Synatra-V0.1-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|arc:challenge|25_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hellaswag|10_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-30-58.971713.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-30-58.971713.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T04-30-58.971713.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T04-30-58.971713.parquet'
- config_name: results
data_files:
- split: 2023_10_10T04_30_58.971713
path:
- results_2023-10-10T04-30-58.971713.parquet
- split: latest
path:
- results_2023-10-10T04-30-58.971713.parquet
---
# Dataset Card for Evaluation run of maywell/Synatra-V0.1-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/maywell/Synatra-V0.1-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [maywell/Synatra-V0.1-7B](https://huggingface.co/maywell/Synatra-V0.1-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maywell__Synatra-V0.1-7B",
"harness_truthfulqa_mc_0",
split="train")
```
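To load only the aggregated scores rather than the per-sample details, you can instead use the `results` configuration's `latest` split (a minimal sketch; the configuration and split names come from the YAML header above):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics; its "latest" split
# always points to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_maywell__Synatra-V0.1-7B",
                       "results",
                       split="latest")
```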
## Latest results
These are the [latest results from run 2023-10-10T04:30:58.971713](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-V0.1-7B/blob/main/results_2023-10-10T04-30-58.971713.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5526128967911348,
"acc_stderr": 0.03462386322845972,
"acc_norm": 0.5565308458707837,
"acc_norm_stderr": 0.0346104765546302,
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431445,
"mc2": 0.557562665558094,
"mc2_stderr": 0.015250255723495946
},
"harness|arc:challenge|25": {
"acc": 0.5179180887372014,
"acc_stderr": 0.014602005585490975,
"acc_norm": 0.552901023890785,
"acc_norm_stderr": 0.014529380160526848
},
"harness|hellaswag|10": {
"acc": 0.5701055566620196,
"acc_stderr": 0.004940490508240653,
"acc_norm": 0.7662816172077276,
"acc_norm_stderr": 0.004223302177263008
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874143,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874143
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.02494236893115979,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.02494236893115979
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.632258064516129,
"acc_stderr": 0.02743086657997347,
"acc_norm": 0.632258064516129,
"acc_norm_stderr": 0.02743086657997347
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.034524539038220406,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.034524539038220406
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.035679697722680495,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.035679697722680495
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.032424979581788166,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.032424979581788166
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.028979089794296736,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.028979089794296736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5358974358974359,
"acc_stderr": 0.025285585990017848,
"acc_norm": 0.5358974358974359,
"acc_norm_stderr": 0.025285585990017848
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5378151260504201,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.5378151260504201,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7577981651376147,
"acc_stderr": 0.01836817630659862,
"acc_norm": 0.7577981651376147,
"acc_norm_stderr": 0.01836817630659862
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.03228210387037892,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.03228210387037892
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497752,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497752
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7369093231162197,
"acc_stderr": 0.01574549716904904,
"acc_norm": 0.7369093231162197,
"acc_norm_stderr": 0.01574549716904904
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6040462427745664,
"acc_stderr": 0.02632981334194624,
"acc_norm": 0.6040462427745664,
"acc_norm_stderr": 0.02632981334194624
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25921787709497207,
"acc_stderr": 0.014655780837497717,
"acc_norm": 0.25921787709497207,
"acc_norm_stderr": 0.014655780837497717
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.028110928492809075,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.028110928492809075
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6205787781350482,
"acc_stderr": 0.02755994980234782,
"acc_norm": 0.6205787781350482,
"acc_norm_stderr": 0.02755994980234782
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.027002521034516468,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.027002521034516468
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614098,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614098
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4015645371577575,
"acc_stderr": 0.012520315120147103,
"acc_norm": 0.4015645371577575,
"acc_norm_stderr": 0.012520315120147103
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213514,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213514
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5473856209150327,
"acc_stderr": 0.020136790918492523,
"acc_norm": 0.5473856209150327,
"acc_norm_stderr": 0.020136790918492523
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.032658195885126966,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.032658195885126966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431445,
"mc2": 0.557562665558094,
"mc2_stderr": 0.015250255723495946
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
infCapital/finance-alpaca_vi | 2023-10-10T04:40:29.000Z | [
"task_categories:question-answering",
"task_categories:text-generation",
"language:vi",
"license:apache-2.0",
"region:us"
] | infCapital | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 48252402
num_examples: 66665
download_size: 24622108
dataset_size: 48252402
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- question-answering
- text-generation
language:
- vi
---
# Dataset Card for "finance-alpaca_vi"
+ Original dataset: [finance-alpaca](https://huggingface.co/datasets/gbharti/finance-alpaca)
+ Translated into Vietnamese using the OpenAI GPT-3.5 API; a minimal loading sketch follows below
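This sketch assumes the `train` split and the `instruction`/`input`/`output` columns declared in the `dataset_info` block above:
```python
from datasets import load_dataset

# Load the translated instruction-tuning data; each row has
# "instruction", "input", and "output" columns per the dataset_info above.
ds = load_dataset("infCapital/finance-alpaca_vi", split="train")
print(ds[0]["instruction"])
print(ds[0]["output"])
```
|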
chrisdosheavymetal/RENAN | 2023-10-10T04:50:59.000Z | [
"region:us"
] | chrisdosheavymetal | null | null | null | 0 | 0 | Entry not found |
michaelginn/childes_phones | 2023-10-10T22:57:03.000Z | [
"region:us"
] | michaelginn | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: line
dtype: string
- name: file
dtype: string
- name: ipa
dtype: string
splits:
- name: train
num_bytes: 1979606
num_examples: 28466
download_size: 932024
dataset_size: 1979606
---
# Dataset Card for "childes_phones"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dparksports/embedded_faqs_medicare | 2023-10-10T05:05:42.000Z | [
"region:us"
] | dparksports | null | null | null | 0 | 0 | Entry not found |
songys/ChatbotData | 2023-10-10T05:24:31.000Z | [
"license:cc-by-sa-4.0",
"region:us"
] | songys | null | null | null | 0 | 0 | ---
license: cc-by-sa-4.0
---
# Chatbot_data
Chatbot_data_for_Korean v1.0
## Data description
This is synthetic data. For some of the breakup-related questions, the answers were written with reference to stories that frequently appear in the Daum cafe "사랑보다 아름다운 실연" ("A breakup more beautiful than love", http://cafe116.daum.net/_c21_/home?grpid=1bld ).
For example, for a question such as "It has been ten days (or 100 days) since my breakup," the answer was written so that the chatbot responds with words of comfort.
1. 11,876 question-answer pairs for chatbot training
2. Labeled 0 for everyday small talk, 1 for breakup (negative), and 2 for love (positive); a minimal filtering sketch is shown below
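The sketch below is hypothetical: the file name `ChatbotData.csv` and the column names `Q`, `A`, and `label` are assumptions, not documented on this card.
```python
import pandas as pd

# Hypothetical file and column names: "ChatbotData.csv" with Q/A/label
# columns is an assumption; adjust to the actual file in the repository.
df = pd.read_csv("ChatbotData.csv")
breakup = df[df["label"] == 1]  # 1 = breakup (negative), per the labeling above
print(len(df), "pairs total,", len(breakup), "breakup pairs")
```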
# Citation
Youngsook Song. (2018). Chatbot_data_for_Korean v1.0 [Online]. Available: https://github.com/songys/Chatbot_data (downloaded 2022, June 29)
|
SRGui/simple_cn_food_demo | 2023-10-10T05:46:24.000Z | [
"task_categories:image-classification",
"region:us"
] | SRGui | null | null | null | 0 | 0 | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: demo-resnet50-test
## Dataset Description
This dataset has been automatically processed by AutoTrain for project demo-resnet50-test.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<600x600 RGB PIL image>",
"target": 0
},
{
"image": "<600x799 RGB PIL image>",
"target": 1
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['000', '005', '033'], id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows (a minimal inspection sketch follows the table):
| Split name | Num samples |
| ------------ | ------------------- |
| train | 288 |
| valid | 75 |
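The sketch assumes the processed splits load directly with `datasets`; the `image`/`target` fields come from the field description above, and the split names are assumed to match the table:
```python
from datasets import load_dataset

# "image" is a PIL image and "target" a ClassLabel index over
# ['000', '005', '033'], per the dataset fields above. The split names
# ("train"/"valid") are assumed from the split table.
ds = load_dataset("SRGui/simple_cn_food_demo", split="train")
sample = ds[0]
print(sample["target"], sample["image"].size)
```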
|
songys/Ko_humane_right_copus | 2023-10-10T05:47:59.000Z | [
"license:cc-by-sa-3.0",
"region:us"
] | songys | null | null | null | 1 | 0 | ---
license: cc-by-sa-3.0
---
# HRC: Building a human rights corpus for interactive generation models
(Korean title: 대화형 생성 모델을 위한 인권 코퍼스 구축)
## Reference data
- Based on decisions and counseling cases from the [National Human Rights Commission of Korea](https://case.humanrights.go.kr/dici/diciList.do)
- To change the writing style and convert the material into question-answer form, example passages that take the surrounding context into account were created, and question-answer pairs were then generated via one-shot prompting with GPT-3.5-turbo
## Data structure
- source_copus/
  - counsel.jsonl
  - decision.jsonl
- humane_right_copus_v1.jsonl
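A minimal sketch for reading these files; the per-record field names are not documented on this card, so inspect the keys first:
```python
import json

# Read the merged corpus file listed above; field names inside each JSON
# object are not documented here, so print the keys of the first record.
with open("humane_right_copus_v1.jsonl", encoding="utf-8") as f:
    records = [json.loads(line) for line in f]
print(len(records), list(records[0].keys()))
```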
## Prompt examples (in Korean)
```
[상담례 prompt]
주어진 상담 문서를 자연스러운 질문, 답변 형태로 변형해 주세요.
답변이 끝나면 '#####'를 작성해 주세요.
반드시 원본의 답변 내용을 기반으로 답변해야 합니다.
질문은 최대한 간결하게 작성해 주세요.
#####
상담 내용: 시청 앞 광장에서 노동조합이 기자회견을 하고 있습니다. 그런데 경찰 세 개 중대가 기자회견장을 에워싸고 집회를 해산하라는 방송을 계속 내보내고 있습니다. 우리는 기자회견을 하고 있으며 설령 집회라 하더라도 평화적으로 하는 집회에 대해 해산 명령을 하는 것은 부당합니다.
원본 상담 답변: 신고를 하지 아니하였다는 이유만으로 개최가 허용되지 않는 집회 내지 시위라고 단정할 수 없습니다. 기자회견이 개인 발언 및 구호 제창 등 미신고 옥외집회의 성격을 띠었다고 하더라도 헌법상 표현의 자유와 집회 시위의 자유라는 민주주의의 근간을 이루는 국민의 기본권이 최대한 보장되도록 해야 합니다. 참고로 대법원(2012도9776)은 “기자회견이 집회에 해당한다 하더라도, 해산명령불응죄가 성립하려면 집회로 인해 타인의 법익이나 공공의 안녕질서에 대한 직접적인 위험이 명백하게 초래된 경우에 해당”하는 것으로 판단한 바 있습니다. 다만, 기자회견이라는 이름으로 개최되고, 일부 기자회견의 성격이 가미되었다고 하더라도 집회로서의 성격을 부인할 수 없다면 신고의 의무까지 면제된다고 보기는 어려울 수 있습니다.
간결한 질문: 시청 앞 광장에서 노동조합이 기자회견을 진행 중입니다. 그런데 경찰이 기자회견장을 둘러싸고 해산하라는 방송을 내보내고 있습니다. 이런 상황에서 경찰의 해산 명령은 정당한 것인가요?
답변: 신고를 하지 않은 집회가 허용되지 않는다는 단순한 이유만으로 기자회견을 미신고 옥외집회로 판단하는 것은 부적절합니다. 기자회견도 표현의 자유와 집회 시위의 자유, 민주주의의 핵심적인 국민의 기본권에 해당하므로 최대한 보장되어야 합니다. 대법원의 판례(2012도9776)에 따르면, "기자회견이 집회에 해당한다 해도, 그 집회로 인해 다른 사람의 법적 이익이나 공공 안녕질서에 대한 명백한 위험이 발생해야 해산명령불응죄가 성립한다"고 하였습니다. 그렇지만, 기자회견이 집회의 성격을 함께 갖는 경우 신고 의무를 면제받는 것은 아닙니다.
#####
상담 내용: {}
원본 상담 답변: {}
간결한 질문:
[결정례 prompt]
주어진 요청, 근거, 결론을 참고하여, 새로운 법률 관련 질문과 답변을 만들어주세요.
답변은 500자 이내로 작성해주세요.
반드시 주어진 자료의 사실을 활용해야 합니다.
답변 끝나면, '*****' 를 작성해주세요.
*****
요청:
진정인은 사회복무요원으로 피진정기관에서 업무 보조 및 폐의약품 수거 일을 하였다. 2021. 7. 26.폐의약품 수거를 위해 진정인의 자전거를 타고 인도 위를 지나다 뛰어오던 행인을 치는 교통사고를 내어 벌금형40만을 선고받았다.교통사고 피해자는 전치 6주 진단을 받았고, 1,30만원가량의 국가배상 청구를 신청한 상태이며, 피진정인은 진정인에게중과실 책임이 있기 때문에 구상권을 청구할 수 있다고 한다.진정인이 인도에서 자전거를 운행한 것은 잘못이지만, 공무 중 발생한 사고에 대해 사회복무요원이 모든 책임을 지는 것은 부당하다. 피진정기관이 진정인에게 구상권을 행사하지 않도록 도와주기 바란다.
근거:
1. 진정인이 자전거로 인도를 횡단하는 등 중과실 책임이 있긴 하나, 사전에 피진정인이 복무관리기관의 장으로서 주의 의무를 다하였다면 진정인이 교통사고에 이르지 않았을 수 있고, 20대 초반의 사회초년생이 벌금을 비롯해 고액의 손해배상 부담을 지는 상황을 피할 수 있었을 것임. 2. 사회복무요원은 병역의무를 이행하고자 국가기관, 지방자치단체 등에 소속되어 공익목적 업무 수행 차원에서 사회서비스 및 행정업무에 복무하고 있는 자이고, 「병역법」 제31조에서도 사회복무요원의 직무상 행위를 공무수행으로 규정하고 있는 바, 공무수행 중 발생한 사고에 대하여 사회복무요원을 공무원 등과 달리 대우할 합리적인 이유가 없음. 따라서 법적 지원체계 마련이 필요함. 3. 진정 취지가 구상권을 행사하지 않도록 해달라는 요청이므로, 이는 헌법상 권리 침해 여부를 판단할 사안이 아님.
결론:
주문 1 : 1. 광명시장에게, 안전사고 위험이 큰 분야에 소속 사회복무요원을 가급적 배치하지 않도록 하고, 배치가 불가피할 때에는 사전에 안전교육을 철저히 실시하도록 의견을 표명합니다. 주문 2 : 2. 병무청장에게, 사회복무요원의 복무 관련 사고 발생 시 초기단계부터 지원하고 민형사상 부담을 최소화하는 지원체계를 마련하도록 의견을 표명합니다. 주문 3 : 3. 이 사건 진정은 각하합니다.
위를 참고한 새로운 질문과 답변
질문: 사회복무요원이 공무 중 발생한 교통사고로 인한 피해자에 대한 구상권을 행사할 수 있는지에 대한 법적 근거가 어떻게 되는지 알려주세요.
답변: 사회복무요원이 공무 중 발생한 교통사고로 인한 피해자에 대한 구상권 여부는 법적으로 인정됩니다. 이는 병역법 제31조에 따라 사회복무요원의 직무를 공무수행으로 규정하고 있으며, 중과실 책임이 있더라도 사전에 안전 조치를 취하거나 교통법규를 준수해 사고를 예방할 의무가 있음을 의미합니다. 따라서 피해자는 피사회복무요원에 대한 손해배상을 청구할 수 있습니다.
*****
요청:
{}
근거:
{}
결론:
{}
위를 참고한 새로운 질문과 답변
질문:
```
## Citation
```
@inproceedings{song2023,
author = {송영숙 and 심상진 and 김성현},
title = {대화형 생성 모델을 위한 인권 코퍼스 구축},
booktitle = {한글 및 한국어 정보처리 학술대회 (발표 예정)},
year = {2023},
publisher = {한글 및 한국어 정보처리 학회}
}
```
|
Mindofmachine/paul_graham_and_sam_altman_articles | 2023-10-10T05:56:13.000Z | [
"region:us"
] | Mindofmachine | null | null | null | 0 | 0 | |
SRGui/autotrain-data-tete | 2023-10-10T06:00:31.000Z | [
"region:us"
] | SRGui | null | null | null | 0 | 0 | Entry not found |
casey-martin/qald_9_plus | 2023-10-10T06:06:19.000Z | [
"task_categories:table-question-answering",
"task_categories:text2text-generation",
"language:ba",
"language:be",
"language:de",
"language:en",
"language:fr",
"language:hy",
"language:lt",
"language:ru",
"language:uk",
"license:cc-by-4.0",
"semantic web",
"sparql",
"wikidata",
"dbpedia",
"arxiv:2202.00120",
"region:us"
] | casey-martin | null | null | null | 0 | 0 | ---
license: cc-by-4.0
task_categories:
- table-question-answering
- text2text-generation
language:
- ba
- be
- de
- en
- fr
- hy
- lt
- ru
- uk
tags:
- semantic web
- sparql
- wikidata
- dbpedia
pretty_name: QALD 9+
---
# QALD-9-plus Dataset Description
QALD-9-plus is a dataset for Knowledge Graph Question Answering (KGQA) based on the well-known [QALD-9](https://github.com/ag-sc/QALD/tree/master/9/data).
QALD-9-plus makes it possible to train and test KGQA systems over DBpedia and Wikidata using questions in 9 different languages: English, German, Russian, French, Armenian, Belarusian, Lithuanian, Bashkir, and Ukrainian.
Some of the questions have several alternative phrasings in particular languages, which makes it possible to evaluate the robustness of KGQA systems and to train paraphrasing models.
As the questions' translations were provided by native speakers, they are considered a "gold standard"; therefore, machine translation tools can be trained and evaluated on the dataset.
# Dataset Statistics
| | en | de | fr | ru | uk | lt | be | ba | hy | # questions DBpedia | # questions Wikidata |
|-------|:---:|:---:|:--:|:----:|:---:|:---:|:---:|:---:|:--:|:-----------:|:-----------:|
| Train | 408 | 543 | 260 | 1203 | 447 | 468 | 441 | 284 | 80 | 408 | 371 |
| Test | 150 | 176 | 26 | 348 | 176 | 186 | 155 | 117 | 20 | 150 | 136 |
Given the numbers, it is clear that some of the languages are covered more than once, i.e., there is more than one translation for a particular question.
For example, there are 1203 Russian translations available while only 408 unique questions exist in the training subset (i.e., about 2.9 Russian translations per question).
The availability of such parallel corpora enables researchers, developers, and other dataset users to address the paraphrasing task.
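For instance, a paraphrase-mining pass over one language could look like the following sketch (hypothetical: the split and field names are assumptions; check the repository's files for the actual schema):
```python
from datasets import load_dataset

# Hypothetical schema: a "train" split with per-row "language" and
# "question" fields is an assumption; inspect ds.features for the
# real layout before relying on it.
ds = load_dataset("casey-martin/qald_9_plus", split="train")
russian = [row["question"] for row in ds if row.get("language") == "ru"]
print(len(russian), "Russian question strings")
```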
# Evaluation
We used the [GERBIL](https://github.com/dice-group/gerbil/) system to evaluate the dataset. Detailed information for each experiment is available at the corresponding link (click the values in the cells below).
## Wikidata
### QAnswer
| | en | de | ru | fr |
|-----|----|----|----|----|
|Test |[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202110010001)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180000)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180001)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180002)|
|Train|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202110010007)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180006)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180007)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180008)|
### DeepPavlov
| | en | ru |
|-----|----|----|
|Test |[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202110080010)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180003)|
|Train|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202110090001)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180009)|
### Platypus
| | en | fr |
|-----|----|----|
|Test |[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202110110004)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180004)|
|Train|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202110110006)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180010)|
## DBpedia
### QAnswer
| | en | de | ru | fr |
|-----|----|----|----|----|
|Test |[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202110120004)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190000)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190001)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190002)|
|Train|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202110130002)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190003)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190004)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190005)|
## Wikidata Original Translations
### QAnswer
| | de | ru | fr |
|-----|----|----|----|
|Test |[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190006)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190007)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190008)|
|Train|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190009)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190010)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190011)|
### DeepPavlov
| | ru |
|-----|----|
|Test |[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190012)|
|Train|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190014)|
### Platypus
| | fr |
|-----|----|
|Test |[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190013)|
|Train|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190015)|
## DBpedia Original Translations
### QAnswer
| | de | ru | fr |
|-----|----|----|----|
|Test |[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190016)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190017)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190018)|
|Train|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190019)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190020)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190021)|
# Cite
```bibtex
@inproceedings{perevalov2022qald9plus,
author={Perevalov, Aleksandr and Diefenbach, Dennis and Usbeck, Ricardo and Both, Andreas},
booktitle={2022 IEEE 16th International Conference on Semantic Computing (ICSC)},
title={QALD-9-plus: A Multilingual Dataset for Question Answering over DBpedia and Wikidata Translated by Native Speakers},
year={2022},
pages={229-234},
doi={10.1109/ICSC52841.2022.00045}
}
```
# Useful Links
* ArXiv [link](https://arxiv.org/abs/2202.00120)
* Papers with Code: [Paper](https://paperswithcode.com/paper/qald-9-plus-a-multilingual-dataset-for-1), [Dataset](https://paperswithcode.com/dataset/qald-9-plus)
* Video presentation on YouTube: https://youtu.be/W1w7CJTV48c
* Presentation [slides](https://drive.google.com/file/d/1cDphq4DeSiZr-WBvdwu34rcxQ0aP4q95/view?usp=sharing)
* Google Colab [notebook](https://colab.research.google.com/drive/1eWsQoIaeT9_vii1v3PVU04Rms4EoyLAh?usp=sharing)
# Licence [![CC BY 4.0][cc-by-shield]][cc-by]
This work is licensed under a
[Creative Commons Attribution 4.0 International License][cc-by].
[![CC BY 4.0][cc-by-image]][cc-by]
[cc-by]: http://creativecommons.org/licenses/by/4.0/
[cc-by-image]: https://i.creativecommons.org/l/by/4.0/88x31.png
[cc-by-shield]: https://img.shields.io/badge/License-CC%20BY%204.0-lightgrey.svg
# Dataset Metadata
The following table is necessary for this dataset to be indexed by search
engines such as <a href="https://g.co/datasetsearch">Google Dataset Search</a>.
<div itemscope itemtype="http://schema.org/Dataset">
<table>
<tr>
<th>property</th>
<th>value</th>
</tr>
<tr>
<td>name</td>
<td><code itemprop="name">QALD-9-plus: A Multilingual Dataset for Question Answering over DBpedia and Wikidata Translated by Native Speakers</code></td>
</tr>
<tr>
<td>alternateName</td>
<td><code itemprop="alternateName">QALD-9-plus</code></td>
</tr>
<tr>
<td>url</td>
<td><code itemprop="url">https://github.com/Perevalov/qald_9_plus/tree/main/data</code></td>
</tr>
<tr>
<td>description</td>
    <td><code itemprop="description">QALD-9-Plus is a dataset for Knowledge Graph Question Answering (KGQA) based on the well-known QALD-9.<br/>
QALD-9-Plus makes it possible to train and test KGQA systems over DBpedia and Wikidata using questions in 9 different languages: English, German, Russian, French, Armenian, Belarusian, Lithuanian, Bashkir, and Ukrainian.<br/>
Some of the questions have several alternative phrasings in particular languages, which makes it possible to evaluate the robustness of KGQA systems and to train paraphrasing models.<br/>
As the questions' translations were provided by native speakers, they are considered a "gold standard"; therefore, machine translation tools can be trained and evaluated on the dataset.</code></td>
</tr>
<tr>
<td>license</td>
<td>
<div itemscope itemtype="http://schema.org/CreativeWork" itemprop="license">
<table>
<tr>
<th>property</th>
<th>value</th>
</tr>
<tr>
<td>name</td>
<td><code itemprop="name">CC-BY-4.0</code></td>
</tr>
<tr>
<td>url</td>
<td><code itemprop="url">https://creativecommons.org/licenses/by/4.0/</code></td>
</tr>
</table>
</div>
</td>
</tr>
<tr>
<td>citation</td>
    <td><code itemprop="citation">Perevalov, Aleksandr; Diefenbach, Dennis; Usbeck, Ricardo; Both, Andreas: QALD-9-plus: A multilingual dataset for question answering over DBpedia and Wikidata translated by native speakers. In: 2022 IEEE 16th International Conference on Semantic Computing (ICSC). IEEE (2022)</code></td>
</tr>
</table>
</div> |
open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B-v2 | 2023-10-10T06:05:44.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of krevas/LDCC-Instruct-Llama-2-ko-13B-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [krevas/LDCC-Instruct-Llama-2-ko-13B-v2](https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B-v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T06:04:26.663902](https://huggingface.co/datasets/open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B-v2/blob/main/results_2023-10-10T06-04-26.663902.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.45958883488115343,\n\
\ \"acc_stderr\": 0.034511714778603424,\n \"acc_norm\": 0.4636864222606454,\n\
\ \"acc_norm_stderr\": 0.03449288105358144,\n \"mc1\": 0.2668298653610771,\n\
\ \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.39776112473254976,\n\
\ \"mc2_stderr\": 0.013677730634490858\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5298634812286689,\n \"acc_stderr\": 0.014585305840007105,\n\
\ \"acc_norm\": 0.5639931740614335,\n \"acc_norm_stderr\": 0.014491225699230916\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6105357498506274,\n\
\ \"acc_stderr\": 0.004866322258335963,\n \"acc_norm\": 0.8181637124078869,\n\
\ \"acc_norm_stderr\": 0.0038492126228151717\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.040335656678483205,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.040335656678483205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.46037735849056605,\n \"acc_stderr\": 0.030676096599389188,\n\
\ \"acc_norm\": 0.46037735849056605,\n \"acc_norm_stderr\": 0.030676096599389188\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.3352601156069364,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.03141082197596239,\n\
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.03141082197596239\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489359,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489359\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03855289616378948,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03855289616378948\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.0220190800122179,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.0220190800122179\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790606,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5096774193548387,\n \"acc_stderr\": 0.02843867799890955,\n \"\
acc_norm\": 0.5096774193548387,\n \"acc_norm_stderr\": 0.02843867799890955\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n \"\
acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.03851716319398395,\n\
\ \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.03851716319398395\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5404040404040404,\n \"acc_stderr\": 0.035507024651313425,\n \"\
acc_norm\": 0.5404040404040404,\n \"acc_norm_stderr\": 0.035507024651313425\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.689119170984456,\n \"acc_stderr\": 0.033403619062765864,\n\
\ \"acc_norm\": 0.689119170984456,\n \"acc_norm_stderr\": 0.033403619062765864\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.40512820512820513,\n \"acc_stderr\": 0.024890471769938145,\n\
\ \"acc_norm\": 0.40512820512820513,\n \"acc_norm_stderr\": 0.024890471769938145\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514565,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514565\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478465,\n\
\ \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478465\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.618348623853211,\n \"acc_stderr\": 0.02082814851702258,\n \"acc_norm\"\
: 0.618348623853211,\n \"acc_norm_stderr\": 0.02082814851702258\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2175925925925926,\n\
\ \"acc_stderr\": 0.028139689444859672,\n \"acc_norm\": 0.2175925925925926,\n\
\ \"acc_norm_stderr\": 0.028139689444859672\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.03426712349247273,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.03426712349247273\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6244725738396625,\n \"acc_stderr\": 0.03152256243091156,\n \
\ \"acc_norm\": 0.6244725738396625,\n \"acc_norm_stderr\": 0.03152256243091156\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.547085201793722,\n\
\ \"acc_stderr\": 0.033408675019233246,\n \"acc_norm\": 0.547085201793722,\n\
\ \"acc_norm_stderr\": 0.033408675019233246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.042059539338841226,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.042059539338841226\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04766075165356462,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04766075165356462\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6134969325153374,\n \"acc_stderr\": 0.038258255488486076,\n\
\ \"acc_norm\": 0.6134969325153374,\n \"acc_norm_stderr\": 0.038258255488486076\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.04825729337356389,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.04825729337356389\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7564102564102564,\n\
\ \"acc_stderr\": 0.028120966503914425,\n \"acc_norm\": 0.7564102564102564,\n\
\ \"acc_norm_stderr\": 0.028120966503914425\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6398467432950191,\n\
\ \"acc_stderr\": 0.017166362471369295,\n \"acc_norm\": 0.6398467432950191,\n\
\ \"acc_norm_stderr\": 0.017166362471369295\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.02691189868637792,\n\
\ \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.02691189868637792\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2569832402234637,\n\
\ \"acc_stderr\": 0.014614465821966337,\n \"acc_norm\": 0.2569832402234637,\n\
\ \"acc_norm_stderr\": 0.014614465821966337\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.028607893699576063,\n\
\ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.028607893699576063\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5691318327974276,\n\
\ \"acc_stderr\": 0.028125340983972714,\n \"acc_norm\": 0.5691318327974276,\n\
\ \"acc_norm_stderr\": 0.028125340983972714\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668777,\n\
\ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668777\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35106382978723405,\n \"acc_stderr\": 0.028473501272963768,\n \
\ \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.028473501272963768\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37157757496740546,\n\
\ \"acc_stderr\": 0.012341828514528285,\n \"acc_norm\": 0.37157757496740546,\n\
\ \"acc_norm_stderr\": 0.012341828514528285\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3713235294117647,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.3713235294117647,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.48366013071895425,\n \"acc_stderr\": 0.02021703065318646,\n \
\ \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.02021703065318646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\
\ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.5727272727272728,\n\
\ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.34285714285714286,\n \"acc_stderr\": 0.03038726291954773,\n\
\ \"acc_norm\": 0.34285714285714286,\n \"acc_norm_stderr\": 0.03038726291954773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6368159203980099,\n\
\ \"acc_stderr\": 0.03400598505599014,\n \"acc_norm\": 0.6368159203980099,\n\
\ \"acc_norm_stderr\": 0.03400598505599014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691584,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691584\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2668298653610771,\n\
\ \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.39776112473254976,\n\
\ \"mc2_stderr\": 0.013677730634490858\n }\n}\n```"
repo_url: https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|arc:challenge|25_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hellaswag|10_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-04-26.663902.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-04-26.663902.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T06-04-26.663902.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T06-04-26.663902.parquet'
- config_name: results
data_files:
- split: 2023_10_10T06_04_26.663902
path:
- results_2023-10-10T06-04-26.663902.parquet
- split: latest
path:
- results_2023-10-10T06-04-26.663902.parquet
---
# Dataset Card for Evaluation run of krevas/LDCC-Instruct-Llama-2-ko-13B-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [krevas/LDCC-Instruct-Llama-2-ko-13B-v2](https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B-v2",
"harness_truthfulqa_mc_0",
split="train")
```
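The aggregated metrics live in the separate "results" configuration listed above; as a minimal follow-up sketch (assuming only the standard `datasets` API and the `latest` split declared in the configs), they could be loaded the same way:
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of each run;
# its "latest" split points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B-v2",
    "results",
    split="latest",
)
print(results[0])  # aggregated acc / acc_norm / mc1 / mc2 values
```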
## Latest results
These are the [latest results from run 2023-10-10T06:04:26.663902](https://huggingface.co/datasets/open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B-v2/blob/main/results_2023-10-10T06-04-26.663902.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.45958883488115343,
"acc_stderr": 0.034511714778603424,
"acc_norm": 0.4636864222606454,
"acc_norm_stderr": 0.03449288105358144,
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.39776112473254976,
"mc2_stderr": 0.013677730634490858
},
"harness|arc:challenge|25": {
"acc": 0.5298634812286689,
"acc_stderr": 0.014585305840007105,
"acc_norm": 0.5639931740614335,
"acc_norm_stderr": 0.014491225699230916
},
"harness|hellaswag|10": {
"acc": 0.6105357498506274,
"acc_stderr": 0.004866322258335963,
"acc_norm": 0.8181637124078869,
"acc_norm_stderr": 0.0038492126228151717
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.040335656678483205,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.040335656678483205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.46037735849056605,
"acc_stderr": 0.030676096599389188,
"acc_norm": 0.46037735849056605,
"acc_norm_stderr": 0.030676096599389188
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.041614023984032786,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.041614023984032786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307809,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307809
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.03141082197596239,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.03141082197596239
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489359,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489359
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03855289616378948,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03855289616378948
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.0220190800122179,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.0220190800122179
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790606,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5096774193548387,
"acc_stderr": 0.02843867799890955,
"acc_norm": 0.5096774193548387,
"acc_norm_stderr": 0.02843867799890955
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.03851716319398395,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.03851716319398395
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5404040404040404,
"acc_stderr": 0.035507024651313425,
"acc_norm": 0.5404040404040404,
"acc_norm_stderr": 0.035507024651313425
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.689119170984456,
"acc_stderr": 0.033403619062765864,
"acc_norm": 0.689119170984456,
"acc_norm_stderr": 0.033403619062765864
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.40512820512820513,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.40512820512820513,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514565,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514565
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40756302521008403,
"acc_stderr": 0.03191863374478465,
"acc_norm": 0.40756302521008403,
"acc_norm_stderr": 0.03191863374478465
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.618348623853211,
"acc_stderr": 0.02082814851702258,
"acc_norm": 0.618348623853211,
"acc_norm_stderr": 0.02082814851702258
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2175925925925926,
"acc_stderr": 0.028139689444859672,
"acc_norm": 0.2175925925925926,
"acc_norm_stderr": 0.028139689444859672
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.03426712349247273,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.03426712349247273
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6244725738396625,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.6244725738396625,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.547085201793722,
"acc_stderr": 0.033408675019233246,
"acc_norm": 0.547085201793722,
"acc_norm_stderr": 0.033408675019233246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.042059539338841226,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.042059539338841226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356462,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356462
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6134969325153374,
"acc_stderr": 0.038258255488486076,
"acc_norm": 0.6134969325153374,
"acc_norm_stderr": 0.038258255488486076
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.04825729337356389,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.04825729337356389
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7564102564102564,
"acc_stderr": 0.028120966503914425,
"acc_norm": 0.7564102564102564,
"acc_norm_stderr": 0.028120966503914425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6398467432950191,
"acc_stderr": 0.017166362471369295,
"acc_norm": 0.6398467432950191,
"acc_norm_stderr": 0.017166362471369295
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5115606936416185,
"acc_stderr": 0.02691189868637792,
"acc_norm": 0.5115606936416185,
"acc_norm_stderr": 0.02691189868637792
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2569832402234637,
"acc_stderr": 0.014614465821966337,
"acc_norm": 0.2569832402234637,
"acc_norm_stderr": 0.014614465821966337
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.028607893699576063,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.028607893699576063
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5691318327974276,
"acc_stderr": 0.028125340983972714,
"acc_norm": 0.5691318327974276,
"acc_norm_stderr": 0.028125340983972714
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.027628737155668777,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.027628737155668777
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35106382978723405,
"acc_stderr": 0.028473501272963768,
"acc_norm": 0.35106382978723405,
"acc_norm_stderr": 0.028473501272963768
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37157757496740546,
"acc_stderr": 0.012341828514528285,
"acc_norm": 0.37157757496740546,
"acc_norm_stderr": 0.012341828514528285
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3713235294117647,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.3713235294117647,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.48366013071895425,
"acc_stderr": 0.02021703065318646,
"acc_norm": 0.48366013071895425,
"acc_norm_stderr": 0.02021703065318646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.34285714285714286,
"acc_stderr": 0.03038726291954773,
"acc_norm": 0.34285714285714286,
"acc_norm_stderr": 0.03038726291954773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6368159203980099,
"acc_stderr": 0.03400598505599014,
"acc_norm": 0.6368159203980099,
"acc_norm_stderr": 0.03400598505599014
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691584,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691584
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.39776112473254976,
"mc2_stderr": 0.013677730634490858
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
FinGPT/fingpt-sentiment-train | 2023-10-10T06:28:24.000Z | [
"region:us"
] | FinGPT | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 18860715
num_examples: 76772
download_size: 6417302
dataset_size: 18860715
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "fingpt-sentiment-train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FinGPT/fingpt-headline | 2023-10-10T06:31:55.000Z | [
"region:us"
] | FinGPT | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 13343930
num_examples: 82161
- name: test
num_bytes: 3339415
num_examples: 20547
download_size: 647377
dataset_size: 16683345
---
# Dataset Card for "fingpt-headline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FinGPT/fingpt-ner | 2023-10-10T06:33:43.000Z | [
"region:us"
] | FinGPT | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 241523
num_examples: 511
- name: test
num_bytes: 63634
num_examples: 98
download_size: 105426
dataset_size: 305157
---
# Dataset Card for "fingpt-ner"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tinhpx2911/book_data | 2023-10-10T07:07:41.000Z | [
"region:us"
] | tinhpx2911 | null | null | null | 0 | 0 | Entry not found |
FinGPT/fingpt-finred-re | 2023-10-10T06:40:16.000Z | [
"region:us"
] | FinGPT | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 11144078
num_examples: 11400
- name: test
num_bytes: 2076314
num_examples: 2136
download_size: 1290513
dataset_size: 13220392
---
# Dataset Card for "fingpt-finred-re"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FinGPT/fingpt-convfinqa | 2023-10-10T06:44:37.000Z | [
"region:us"
] | FinGPT | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 52762154
num_examples: 11104
- name: test
num_bytes: 6733552
num_examples: 1490
download_size: 10979923
dataset_size: 59495706
---
# Dataset Card for "fingpt-convfinqa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FinGPT/fingpt-fiqa_qa | 2023-10-10T06:51:12.000Z | [
"region:us"
] | FinGPT | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 20914549
num_examples: 17110
download_size: 10813846
dataset_size: 20914549
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "fingpt-fiqa_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1 | 2023-10-10T06:40:11.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of mistralai/Mistral-7B-Instruct-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mistralai/Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T06:38:48.353025](https://huggingface.co/datasets/open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1/blob/main/results_2023-10-10T06-38-48.353025.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5534994306638509,\n\
\ \"acc_stderr\": 0.03475700070795008,\n \"acc_norm\": 0.5570434858760736,\n\
\ \"acc_norm_stderr\": 0.03474510896674971,\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418194,\n \"mc2\": 0.5628382292113293,\n\
\ \"mc2_stderr\": 0.015351892312006444\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.523037542662116,\n \"acc_stderr\": 0.014595873205358269,\n\
\ \"acc_norm\": 0.5452218430034129,\n \"acc_norm_stderr\": 0.014551507060836357\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5694084843656642,\n\
\ \"acc_stderr\": 0.004941470620074867,\n \"acc_norm\": 0.7563234415455089,\n\
\ \"acc_norm_stderr\": 0.0042842240337755385\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483205,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.030325945789286105,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.030325945789286105\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273956,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273956\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.032662042990646775,\n\
\ \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.032662042990646775\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36507936507936506,\n \"acc_stderr\": 0.024796060602699947,\n \"\
acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.024796060602699947\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6419354838709678,\n \"acc_stderr\": 0.027273890594300642,\n \"\
acc_norm\": 0.6419354838709678,\n \"acc_norm_stderr\": 0.027273890594300642\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406795,\n \"\
acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406795\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.036810508691615486,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.036810508691615486\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.031911782267135466,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.031911782267135466\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7253886010362695,\n \"acc_stderr\": 0.03221024508041154,\n\
\ \"acc_norm\": 0.7253886010362695,\n \"acc_norm_stderr\": 0.03221024508041154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5205128205128206,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.5205128205128206,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.032339434681820885,\n\
\ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.032339434681820885\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.710091743119266,\n \"acc_stderr\": 0.019453066609201597,\n \"\
acc_norm\": 0.710091743119266,\n \"acc_norm_stderr\": 0.019453066609201597\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7058823529411765,\n \"acc_stderr\": 0.03198001660115072,\n \"\
acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.03198001660115072\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6962025316455697,\n \"acc_stderr\": 0.0299366963871386,\n \
\ \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.0299366963871386\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806299,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806299\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161551,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161551\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560392,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560392\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7432950191570882,\n\
\ \"acc_stderr\": 0.015620480263064533,\n \"acc_norm\": 0.7432950191570882,\n\
\ \"acc_norm_stderr\": 0.015620480263064533\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.026483392042098174,\n\
\ \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.026483392042098174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\
\ \"acc_stderr\": 0.014378169884098417,\n \"acc_norm\": 0.2446927374301676,\n\
\ \"acc_norm_stderr\": 0.014378169884098417\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.02787074527829027,\n\
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.02787074527829027\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n\
\ \"acc_stderr\": 0.027731258647012,\n \"acc_norm\": 0.6077170418006431,\n\
\ \"acc_norm_stderr\": 0.027731258647012\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5771604938271605,\n \"acc_stderr\": 0.027487472980871595,\n\
\ \"acc_norm\": 0.5771604938271605,\n \"acc_norm_stderr\": 0.027487472980871595\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.028999080904806185,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.028999080904806185\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40091264667535853,\n\
\ \"acc_stderr\": 0.012516960350640824,\n \"acc_norm\": 0.40091264667535853,\n\
\ \"acc_norm_stderr\": 0.012516960350640824\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.02997280717046462,\n\
\ \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.02997280717046462\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02021703065318646,\n \
\ \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02021703065318646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.02992310056368391,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.02992310056368391\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691584,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691584\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418194,\n \"mc2\": 0.5628382292113293,\n\
\ \"mc2_stderr\": 0.015351892312006444\n }\n}\n```"
repo_url: https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|arc:challenge|25_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hellaswag|10_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T06-38-48.353025.parquet'
- config_name: results
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- results_2023-10-10T06-38-48.353025.parquet
- split: latest
path:
- results_2023-10-10T06-38-48.353025.parquet
---
# Dataset Card for Evaluation run of mistralai/Mistral-7B-Instruct-v0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mistralai/Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1",
"harness_truthfulqa_mc_0",
split="train")
```
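Likewise, you can target the aggregated scores directly by loading the `results` config; a minimal sketch (the config name and the `latest` split both come from the `configs` list in the YAML header above):
```python
from datasets import load_dataset

# Aggregated metrics only, for the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1",
    "results",
    split="latest",
)
print(results[0])  # inspect the first row of aggregated metrics
```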
## Latest results
These are the [latest results from run 2023-10-10T06:38:48.353025](https://huggingface.co/datasets/open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1/blob/main/results_2023-10-10T06-38-48.353025.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5534994306638509,
"acc_stderr": 0.03475700070795008,
"acc_norm": 0.5570434858760736,
"acc_norm_stderr": 0.03474510896674971,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.5628382292113293,
"mc2_stderr": 0.015351892312006444
},
"harness|arc:challenge|25": {
"acc": 0.523037542662116,
"acc_stderr": 0.014595873205358269,
"acc_norm": 0.5452218430034129,
"acc_norm_stderr": 0.014551507060836357
},
"harness|hellaswag|10": {
"acc": 0.5694084843656642,
"acc_stderr": 0.004941470620074867,
"acc_norm": 0.7563234415455089,
"acc_norm_stderr": 0.0042842240337755385
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.040335656678483205,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.040335656678483205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.030325945789286105,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.030325945789286105
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273956,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273956
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.032662042990646775,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.032662042990646775
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.024796060602699947,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.024796060602699947
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300642,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300642
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.036810508691615486,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.036810508691615486
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.031911782267135466,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.031911782267135466
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7253886010362695,
"acc_stderr": 0.03221024508041154,
"acc_norm": 0.7253886010362695,
"acc_norm_stderr": 0.03221024508041154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5205128205128206,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.5205128205128206,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.032339434681820885,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.032339434681820885
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.710091743119266,
"acc_stderr": 0.019453066609201597,
"acc_norm": 0.710091743119266,
"acc_norm_stderr": 0.019453066609201597
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.03198001660115072,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.03198001660115072
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.0299366963871386,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.0299366963871386
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806299,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806299
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.042664163633521685,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.042664163633521685
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161551,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161551
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560392,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560392
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7432950191570882,
"acc_stderr": 0.015620480263064533,
"acc_norm": 0.7432950191570882,
"acc_norm_stderr": 0.015620480263064533
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.026483392042098174,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.026483392042098174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098417,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098417
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.02787074527829027,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.02787074527829027
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.027731258647012,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.027731258647012
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5771604938271605,
"acc_stderr": 0.027487472980871595,
"acc_norm": 0.5771604938271605,
"acc_norm_stderr": 0.027487472980871595
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.028999080904806185,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.028999080904806185
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40091264667535853,
"acc_stderr": 0.012516960350640824,
"acc_norm": 0.40091264667535853,
"acc_norm_stderr": 0.012516960350640824
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.02997280717046462,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.02997280717046462
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.02021703065318646,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.02021703065318646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.02992310056368391,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.02992310056368391
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691584,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691584
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.5628382292113293,
"mc2_stderr": 0.015351892312006444
}
}
```
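To dig into the per-sample predictions behind any of these scores, load the matching subtask config instead; a hedged sketch using one of the config names declared in the YAML header:
```python
from datasets import load_dataset

# Per-sample details for a single MMLU subtask; any other config name
# from the `configs` list works the same way
details = load_dataset(
    "open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1",
    "harness_hendrycksTest_moral_scenarios_5",
    split="latest",
)
print(details)
```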
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
FinGPT/fingpt-headline-cls | 2023-10-10T06:47:59.000Z | [
"region:us"
] | FinGPT | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 12571278
num_examples: 82161
- name: test
num_bytes: 3147768
num_examples: 20547
download_size: 986960
dataset_size: 15719046
---
# Dataset Card for "fingpt-headline-cls"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FinGPT/fingpt-sentiment-cls | 2023-10-10T06:49:38.000Z | [
"region:us"
] | FinGPT | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 10908696
num_examples: 47557
download_size: 3902114
dataset_size: 10908696
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "fingpt-sentiment-cls"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FinGPT/fingpt-ner-cls | 2023-10-10T06:42:34.000Z | [
"region:us"
] | FinGPT | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 5730497
num_examples: 13549
- name: test
num_bytes: 2112011
num_examples: 3502
download_size: 298810
dataset_size: 7842508
---
# Dataset Card for "fingpt-ner-cls"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FinGPT/fingpt-finred-cls | 2023-10-10T06:41:54.000Z | [
"region:us"
] | FinGPT | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 23991756
num_examples: 48474
- name: test
num_bytes: 3899700
num_examples: 8928
download_size: 2897823
dataset_size: 27891456
---
# Dataset Card for "fingpt-finred-cls"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FinGPT/fingpt-fineval | 2023-10-10T06:45:52.000Z | [
"region:us"
] | FinGPT | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 441991
num_examples: 1056
- name: test
num_bytes: 117516
num_examples: 265
download_size: 269193
dataset_size: 559507
---
# Dataset Card for "fingpt-fineval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pbaoo2705/cpgqa_processed_eval | 2023-10-10T06:53:21.000Z | [
"region:us"
] | pbaoo2705 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: title
dtype: string
- name: id
dtype: int64
- name: question
dtype: string
- name: answer_text
dtype: string
- name: answer_start
dtype: int64
- name: context
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: answer
dtype: string
- name: start_positions
dtype: int64
- name: end_positions
dtype: int64
splits:
- name: validation
num_bytes: 1212109
num_examples: 104
download_size: 35223
dataset_size: 1212109
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "cpgqa_processed_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FinGPT/fingpt-finred | 2023-10-10T06:58:37.000Z | [
"region:us"
] | FinGPT | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 25113554
num_examples: 27558
- name: test
num_bytes: 4477146
num_examples: 5112
download_size: 2114835
dataset_size: 29590700
---
# Dataset Card for "fingpt-finred"
This dataset consists of both a Relation Extraction part and a Classification part, and is used in Multi-task Instruction Tuning.
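A minimal loading sketch (splits and features come from the YAML header above; the schema alone does not distinguish relation-extraction rows from classification rows):
```python
from datasets import load_dataset

ds = load_dataset("FinGPT/fingpt-finred")
ex = ds["train"][0]
# Every record is an (instruction, input, output) triple suitable for
# multi-task instruction tuning
for key in ("instruction", "input", "output"):
    print(f"{key}: {ex[key]}")
```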
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_maywell__Synatra-V0.1-7B-Instruct | 2023-10-10T06:58:26.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of maywell/Synatra-V0.1-7B-Instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [maywell/Synatra-V0.1-7B-Instruct](https://huggingface.co/maywell/Synatra-V0.1-7B-Instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__Synatra-V0.1-7B-Instruct\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T06:57:04.221099](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-V0.1-7B-Instruct/blob/main/results_2023-10-10T06-57-04.221099.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5526128967911348,\n\
\ \"acc_stderr\": 0.03462386322845972,\n \"acc_norm\": 0.5565308458707837,\n\
\ \"acc_norm_stderr\": 0.0346104765546302,\n \"mc1\": 0.390452876376989,\n\
\ \"mc1_stderr\": 0.017078230743431445,\n \"mc2\": 0.557562665558094,\n\
\ \"mc2_stderr\": 0.015250255723495946\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5179180887372014,\n \"acc_stderr\": 0.014602005585490975,\n\
\ \"acc_norm\": 0.552901023890785,\n \"acc_norm_stderr\": 0.014529380160526848\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5701055566620196,\n\
\ \"acc_stderr\": 0.004940490508240653,\n \"acc_norm\": 0.7662816172077276,\n\
\ \"acc_norm_stderr\": 0.004223302177263008\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874143,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874143\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.03260038511835771,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.03260038511835771\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115979,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115979\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n\
\ \"acc_stderr\": 0.02743086657997347,\n \"acc_norm\": 0.632258064516129,\n\
\ \"acc_norm_stderr\": 0.02743086657997347\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.034524539038220406,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.034524539038220406\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.035679697722680495,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.035679697722680495\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.028979089794296736,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.028979089794296736\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5358974358974359,\n \"acc_stderr\": 0.025285585990017848,\n\
\ \"acc_norm\": 0.5358974358974359,\n \"acc_norm_stderr\": 0.025285585990017848\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n\
\ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7577981651376147,\n \"acc_stderr\": 0.01836817630659862,\n \"\
acc_norm\": 0.7577981651376147,\n \"acc_norm_stderr\": 0.01836817630659862\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.696078431372549,\n \"acc_stderr\": 0.03228210387037892,\n \"acc_norm\"\
: 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037892\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7088607594936709,\n \"acc_stderr\": 0.029571601065753374,\n \"\
acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.029571601065753374\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497752,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497752\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7369093231162197,\n\
\ \"acc_stderr\": 0.01574549716904904,\n \"acc_norm\": 0.7369093231162197,\n\
\ \"acc_norm_stderr\": 0.01574549716904904\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.02632981334194624,\n\
\ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.02632981334194624\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25921787709497207,\n\
\ \"acc_stderr\": 0.014655780837497717,\n \"acc_norm\": 0.25921787709497207,\n\
\ \"acc_norm_stderr\": 0.014655780837497717\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809075,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809075\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\
\ \"acc_stderr\": 0.02755994980234782,\n \"acc_norm\": 0.6205787781350482,\n\
\ \"acc_norm_stderr\": 0.02755994980234782\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.027002521034516468,\n\
\ \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.027002521034516468\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614098,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614098\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4015645371577575,\n\
\ \"acc_stderr\": 0.012520315120147103,\n \"acc_norm\": 0.4015645371577575,\n\
\ \"acc_norm_stderr\": 0.012520315120147103\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213514,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213514\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5473856209150327,\n \"acc_stderr\": 0.020136790918492523,\n \
\ \"acc_norm\": 0.5473856209150327,\n \"acc_norm_stderr\": 0.020136790918492523\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n\
\ \"acc_stderr\": 0.032658195885126966,\n \"acc_norm\": 0.6915422885572139,\n\
\ \"acc_norm_stderr\": 0.032658195885126966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.390452876376989,\n\
\ \"mc1_stderr\": 0.017078230743431445,\n \"mc2\": 0.557562665558094,\n\
\ \"mc2_stderr\": 0.015250255723495946\n }\n}\n```"
repo_url: https://huggingface.co/maywell/Synatra-V0.1-7B-Instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|arc:challenge|25_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hellaswag|10_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-57-04.221099.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-57-04.221099.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T06-57-04.221099.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T06-57-04.221099.parquet'
- config_name: results
data_files:
- split: 2023_10_10T06_57_04.221099
path:
- results_2023-10-10T06-57-04.221099.parquet
- split: latest
path:
- results_2023-10-10T06-57-04.221099.parquet
---
# Dataset Card for Evaluation run of maywell/Synatra-V0.1-7B-Instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/maywell/Synatra-V0.1-7B-Instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [maywell/Synatra-V0.1-7B-Instruct](https://huggingface.co/maywell/Synatra-V0.1-7B-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maywell__Synatra-V0.1-7B-Instruct",
"harness_truthfulqa_mc_0",
split="train")
```
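Likewise, the aggregated scores live in the "results" configuration declared above; a minimal sketch of loading its "latest" split:
```python
from datasets import load_dataset

# Aggregated metrics from the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_maywell__Synatra-V0.1-7B-Instruct",
    "results",
    split="latest",
)
print(results[0])
```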
## Latest results
These are the [latest results from run 2023-10-10T06:57:04.221099](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-V0.1-7B-Instruct/blob/main/results_2023-10-10T06-57-04.221099.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5526128967911348,
"acc_stderr": 0.03462386322845972,
"acc_norm": 0.5565308458707837,
"acc_norm_stderr": 0.0346104765546302,
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431445,
"mc2": 0.557562665558094,
"mc2_stderr": 0.015250255723495946
},
"harness|arc:challenge|25": {
"acc": 0.5179180887372014,
"acc_stderr": 0.014602005585490975,
"acc_norm": 0.552901023890785,
"acc_norm_stderr": 0.014529380160526848
},
"harness|hellaswag|10": {
"acc": 0.5701055566620196,
"acc_stderr": 0.004940490508240653,
"acc_norm": 0.7662816172077276,
"acc_norm_stderr": 0.004223302177263008
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874143,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874143
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.02494236893115979,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.02494236893115979
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.632258064516129,
"acc_stderr": 0.02743086657997347,
"acc_norm": 0.632258064516129,
"acc_norm_stderr": 0.02743086657997347
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.034524539038220406,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.034524539038220406
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.035679697722680495,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.035679697722680495
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.032424979581788166,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.032424979581788166
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.028979089794296736,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.028979089794296736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5358974358974359,
"acc_stderr": 0.025285585990017848,
"acc_norm": 0.5358974358974359,
"acc_norm_stderr": 0.025285585990017848
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5378151260504201,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.5378151260504201,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7577981651376147,
"acc_stderr": 0.01836817630659862,
"acc_norm": 0.7577981651376147,
"acc_norm_stderr": 0.01836817630659862
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.03228210387037892,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.03228210387037892
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497752,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497752
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7369093231162197,
"acc_stderr": 0.01574549716904904,
"acc_norm": 0.7369093231162197,
"acc_norm_stderr": 0.01574549716904904
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6040462427745664,
"acc_stderr": 0.02632981334194624,
"acc_norm": 0.6040462427745664,
"acc_norm_stderr": 0.02632981334194624
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25921787709497207,
"acc_stderr": 0.014655780837497717,
"acc_norm": 0.25921787709497207,
"acc_norm_stderr": 0.014655780837497717
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.028110928492809075,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.028110928492809075
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6205787781350482,
"acc_stderr": 0.02755994980234782,
"acc_norm": 0.6205787781350482,
"acc_norm_stderr": 0.02755994980234782
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.027002521034516468,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.027002521034516468
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614098,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614098
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4015645371577575,
"acc_stderr": 0.012520315120147103,
"acc_norm": 0.4015645371577575,
"acc_norm_stderr": 0.012520315120147103
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213514,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213514
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5473856209150327,
"acc_stderr": 0.020136790918492523,
"acc_norm": 0.5473856209150327,
"acc_norm_stderr": 0.020136790918492523
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.032658195885126966,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.032658195885126966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431445,
"mc2": 0.557562665558094,
"mc2_stderr": 0.015250255723495946
}
}
```
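The per-task entries above all share the same shape (`acc`, `acc_stderr`, and their `_norm` counterparts), except TruthfulQA, which reports `mc1`/`mc2` instead. That makes quick post-processing straightforward; a minimal, self-contained sketch using a small excerpt of the values above (the excerpt and variable names are illustrative only, not part of the repository):
```python
import json

# A small excerpt of the results block above, for illustration only
raw = """{
  "harness|hendrycksTest-moral_scenarios|5": {"acc": 0.25921787709497207},
  "harness|hendrycksTest-marketing|5": {"acc": 0.8504273504273504},
  "harness|truthfulqa:mc|0": {"mc1": 0.390452876376989}
}"""
results = json.loads(raw)

# Rank tasks by accuracy, skipping entries (like truthfulqa:mc) that
# report mc1/mc2 rather than acc
ranked = sorted(
    ((task, m["acc"]) for task, m in results.items() if "acc" in m),
    key=lambda pair: pair[1],
)
for task, acc in ranked:
    print(f"{acc:.3f}  {task}")
```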
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_JosephusCheung__Pwen-7B-Chat-20_30 | 2023-10-10T07:02:37.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of JosephusCheung/Pwen-7B-Chat-20_30
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JosephusCheung/Pwen-7B-Chat-20_30](https://huggingface.co/JosephusCheung/Pwen-7B-Chat-20_30)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JosephusCheung__Pwen-7B-Chat-20_30\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T07:01:15.573690](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Pwen-7B-Chat-20_30/blob/main/results_2023-10-10T07-01-15.573690.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6173946376864352,\n\
\ \"acc_stderr\": 0.03337871209335492,\n \"acc_norm\": 0.6210628259045844,\n\
\ \"acc_norm_stderr\": 0.03336880945880684,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.0162380650690596,\n \"mc2\": 0.47014420938426915,\n\
\ \"mc2_stderr\": 0.014571966148559557\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4854948805460751,\n \"acc_stderr\": 0.014605241081370053,\n\
\ \"acc_norm\": 0.514505119453925,\n \"acc_norm_stderr\": 0.014605241081370056\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5524795857398924,\n\
\ \"acc_stderr\": 0.0049622205125483525,\n \"acc_norm\": 0.739892451702848,\n\
\ \"acc_norm_stderr\": 0.004377965074211627\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.038781398887976104,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.038781398887976104\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118634,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118634\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382175,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382175\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7516129032258064,\n \"acc_stderr\": 0.024580028921481003,\n \"\
acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.024580028921481003\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\"\
: 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.02478431694215639,\n \
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.02478431694215639\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848036,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848036\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.0286265479124374,\n \"acc_norm\"\
: 0.7892156862745098,\n \"acc_norm_stderr\": 0.0286265479124374\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \"\
acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572203,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572203\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.042450224863844956,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.042450224863844956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.01414397027665757,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.01414397027665757\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688218,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688218\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n\
\ \"acc_stderr\": 0.015445716910998874,\n \"acc_norm\": 0.30837988826815643,\n\
\ \"acc_norm_stderr\": 0.015445716910998874\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826514,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826514\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984824,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984824\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236837,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236837\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49282920469361147,\n\
\ \"acc_stderr\": 0.012768922739553313,\n \"acc_norm\": 0.49282920469361147,\n\
\ \"acc_norm_stderr\": 0.012768922739553313\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n\
\ \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6209150326797386,\n \"acc_stderr\": 0.019627444748412243,\n \
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.019627444748412243\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505416,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505416\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.0162380650690596,\n \"mc2\": 0.47014420938426915,\n\
\ \"mc2_stderr\": 0.014571966148559557\n }\n}\n```"
repo_url: https://huggingface.co/JosephusCheung/Pwen-7B-Chat-20_30
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|arc:challenge|25_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hellaswag|10_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T07-01-15.573690.parquet'
- config_name: results
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- results_2023-10-10T07-01-15.573690.parquet
- split: latest
path:
- results_2023-10-10T07-01-15.573690.parquet
---
# Dataset Card for Evaluation run of JosephusCheung/Pwen-7B-Chat-20_30
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/JosephusCheung/Pwen-7B-Chat-20_30
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [JosephusCheung/Pwen-7B-Chat-20_30](https://huggingface.co/JosephusCheung/Pwen-7B-Chat-20_30) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JosephusCheung__Pwen-7B-Chat-20_30",
"harness_truthfulqa_mc_0",
split="train")
```
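The same call works for any configuration listed in the header above; in particular, the aggregated figures live in the `results` configuration, and every configuration also exposes a `latest` split pointing at the most recent run. A minimal sketch (the column layout of the results parquet is not documented on this card, so it is inspected rather than assumed):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_JosephusCheung__Pwen-7B-Chat-20_30",
    "results",
    split="latest",
)

# Column names are not documented here, so inspect before use
print(results.column_names)
```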
## Latest results
These are the [latest results from run 2023-10-10T07:01:15.573690](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Pwen-7B-Chat-20_30/blob/main/results_2023-10-10T07-01-15.573690.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6173946376864352,
"acc_stderr": 0.03337871209335492,
"acc_norm": 0.6210628259045844,
"acc_norm_stderr": 0.03336880945880684,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.0162380650690596,
"mc2": 0.47014420938426915,
"mc2_stderr": 0.014571966148559557
},
"harness|arc:challenge|25": {
"acc": 0.4854948805460751,
"acc_stderr": 0.014605241081370053,
"acc_norm": 0.514505119453925,
"acc_norm_stderr": 0.014605241081370056
},
"harness|hellaswag|10": {
"acc": 0.5524795857398924,
"acc_stderr": 0.0049622205125483525,
"acc_norm": 0.739892451702848,
"acc_norm_stderr": 0.004377965074211627
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118634,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118634
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382175,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382175
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276877,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276877
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.02478431694215639,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.02478431694215639
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848036,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848036
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.0286265479124374,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.0286265479124374
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572203,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572203
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.042450224863844956,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.042450224863844956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.01414397027665757,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.01414397027665757
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688218,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688218
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30837988826815643,
"acc_stderr": 0.015445716910998874,
"acc_norm": 0.30837988826815643,
"acc_norm_stderr": 0.015445716910998874
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826514,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826514
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984824,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984824
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236837,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236837
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49282920469361147,
"acc_stderr": 0.012768922739553313,
"acc_norm": 0.49282920469361147,
"acc_norm_stderr": 0.012768922739553313
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.019627444748412243,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.019627444748412243
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505416,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505416
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.0162380650690596,
"mc2": 0.47014420938426915,
"mc2_stderr": 0.014571966148559557
}
}
```
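The aggregated metrics above can also be retrieved programmatically. A minimal sketch, assuming this repository follows the same `results` configuration and `latest` split convention used by the other evaluation-run datasets in this collection:
```python
from datasets import load_dataset

# Sketch only: fetch the aggregated per-task metrics shown above.
# The "results" config and "latest" split names are assumed from the
# convention used across these evaluation-run repositories.
results = load_dataset(
    "open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b",
    "results",
    split="latest",
)
print(results[0])  # typically a single row holding the aggregated metrics
```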
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
crumb/textbook-codex | 2023-10-10T12:01:25.000Z | [
"region:us"
] | crumb | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: src
dtype: string
- name: src_col
dtype: string
splits:
- name: train
num_bytes: 11980410738.0
num_examples: 3590209
download_size: 5623419837
dataset_size: 11980410738.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "textbook-codex"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
4n3mone/test | 2023-10-10T07:24:07.000Z | [
"license:mit",
"region:us"
] | 4n3mone | null | null | null | 0 | 0 | ---
license: mit
---
|
open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied | 2023-10-10T07:27:08.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of hiyouga/Baichuan2-7B-Base-LLaMAfied
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hiyouga/Baichuan2-7B-Base-LLaMAfied](https://huggingface.co/hiyouga/Baichuan2-7B-Base-LLaMAfied)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T07:25:43.126145](https://huggingface.co/datasets/open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied/blob/main/results_2023-10-10T07-25-43.126145.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5471861799249809,\n\
\ \"acc_stderr\": 0.034615509927687124,\n \"acc_norm\": 0.5508249838346596,\n\
\ \"acc_norm_stderr\": 0.034606338203415,\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080508,\n \"mc2\": 0.37535380227171294,\n\
\ \"mc2_stderr\": 0.013767926078311071\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4718430034129693,\n \"acc_stderr\": 0.014588204105102203,\n\
\ \"acc_norm\": 0.49573378839590443,\n \"acc_norm_stderr\": 0.014610858923956952\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.54371639115714,\n \
\ \"acc_stderr\": 0.004970672651595851,\n \"acc_norm\": 0.73451503684525,\n\
\ \"acc_norm_stderr\": 0.004406886100685868\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.03807301726504513,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.03807301726504513\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.032662042990646775,\n\
\ \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.032662042990646775\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537314,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537314\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.02467786284133278,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.02467786284133278\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n\
\ \"acc_stderr\": 0.027218889773308767,\n \"acc_norm\": 0.6451612903225806,\n\
\ \"acc_norm_stderr\": 0.027218889773308767\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n\
\ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713547,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713547\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178267,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178267\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4897435897435897,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.4897435897435897,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.02742001935094528,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.02742001935094528\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.0322841062671639,\n \
\ \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.0322841062671639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7357798165137615,\n \"acc_stderr\": 0.01890416417151018,\n \"\
acc_norm\": 0.7357798165137615,\n \"acc_norm_stderr\": 0.01890416417151018\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6813725490196079,\n \"acc_stderr\": 0.0327028718148208,\n \"acc_norm\"\
: 0.6813725490196079,\n \"acc_norm_stderr\": 0.0327028718148208\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598035,\n \"\
acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5739910313901345,\n\
\ \"acc_stderr\": 0.03318833286217281,\n \"acc_norm\": 0.5739910313901345,\n\
\ \"acc_norm_stderr\": 0.03318833286217281\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6319018404907976,\n \"acc_stderr\": 0.03789213935838396,\n\
\ \"acc_norm\": 0.6319018404907976,\n \"acc_norm_stderr\": 0.03789213935838396\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.04656147110012351,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.04656147110012351\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7471264367816092,\n\
\ \"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.7471264367816092,\n\
\ \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.026424816594009845,\n\
\ \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.026424816594009845\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37318435754189944,\n\
\ \"acc_stderr\": 0.01617569201338197,\n \"acc_norm\": 0.37318435754189944,\n\
\ \"acc_norm_stderr\": 0.01617569201338197\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.027604689028582,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.027604689028582\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380157,\n\
\ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380157\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255855,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255855\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40808344198174706,\n\
\ \"acc_stderr\": 0.012552598958563664,\n \"acc_norm\": 0.40808344198174706,\n\
\ \"acc_norm_stderr\": 0.012552598958563664\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.03027332507734576,\n\
\ \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.03027332507734576\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5130718954248366,\n \"acc_stderr\": 0.020220920829626916,\n \
\ \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.020220920829626916\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.03301405946987249,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.03301405946987249\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080508,\n \"mc2\": 0.37535380227171294,\n\
\ \"mc2_stderr\": 0.013767926078311071\n }\n}\n```"
repo_url: https://huggingface.co/hiyouga/Baichuan2-7B-Base-LLaMAfied
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|arc:challenge|25_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hellaswag|10_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T07-25-43.126145.parquet'
- config_name: results
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- results_2023-10-10T07-25-43.126145.parquet
- split: latest
path:
- results_2023-10-10T07-25-43.126145.parquet
---
# Dataset Card for Evaluation run of hiyouga/Baichuan2-7B-Base-LLaMAfied
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/hiyouga/Baichuan2-7B-Base-LLaMAfied
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [hiyouga/Baichuan2-7B-Base-LLaMAfied](https://huggingface.co/hiyouga/Baichuan2-7B-Base-LLaMAfied) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied",
"harness_truthfulqa_mc_0",
split="train")
```
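The most recent run can also be addressed explicitly via the `latest` split, and the aggregated numbers via the `results` configuration; both names appear in this card's metadata. A minimal sketch using those names:
```python
from datasets import load_dataset

# Per-example details for one task; "latest" tracks the newest
# timestamped run listed in the configs of this card.
details = load_dataset(
    "open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied",
    "harness_truthfulqa_mc_0",
    split="latest",
)

# Aggregated metrics for the whole run (the "results" configuration).
results = load_dataset(
    "open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied",
    "results",
    split="latest",
)
```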
## Latest results
These are the [latest results from run 2023-10-10T07:25:43.126145](https://huggingface.co/datasets/open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied/blob/main/results_2023-10-10T07-25-43.126145.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5471861799249809,
"acc_stderr": 0.034615509927687124,
"acc_norm": 0.5508249838346596,
"acc_norm_stderr": 0.034606338203415,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080508,
"mc2": 0.37535380227171294,
"mc2_stderr": 0.013767926078311071
},
"harness|arc:challenge|25": {
"acc": 0.4718430034129693,
"acc_stderr": 0.014588204105102203,
"acc_norm": 0.49573378839590443,
"acc_norm_stderr": 0.014610858923956952
},
"harness|hellaswag|10": {
"acc": 0.54371639115714,
"acc_stderr": 0.004970672651595851,
"acc_norm": 0.73451503684525,
"acc_norm_stderr": 0.004406886100685868
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.03807301726504513,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.03807301726504513
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.032662042990646775,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.032662042990646775
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537314,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537314
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.02467786284133278,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.02467786284133278
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.027218889773308767,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.027218889773308767
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713547,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713547
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178267,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178267
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4897435897435897,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.4897435897435897,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.02742001935094528,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.02742001935094528
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5546218487394958,
"acc_stderr": 0.0322841062671639,
"acc_norm": 0.5546218487394958,
"acc_norm_stderr": 0.0322841062671639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7357798165137615,
"acc_stderr": 0.01890416417151018,
"acc_norm": 0.7357798165137615,
"acc_norm_stderr": 0.01890416417151018
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.0327028718148208,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.0327028718148208
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5739910313901345,
"acc_stderr": 0.03318833286217281,
"acc_norm": 0.5739910313901345,
"acc_norm_stderr": 0.03318833286217281
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6319018404907976,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.6319018404907976,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.04656147110012351,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.04656147110012351
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483717,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483717
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7471264367816092,
"acc_stderr": 0.015543377313719681,
"acc_norm": 0.7471264367816092,
"acc_norm_stderr": 0.015543377313719681
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.026424816594009845,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.026424816594009845
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37318435754189944,
"acc_stderr": 0.01617569201338197,
"acc_norm": 0.37318435754189944,
"acc_norm_stderr": 0.01617569201338197
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028582,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028582
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.026915003011380157,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.026915003011380157
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255855,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40808344198174706,
"acc_stderr": 0.012552598958563664,
"acc_norm": 0.40808344198174706,
"acc_norm_stderr": 0.012552598958563664
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.03027332507734576,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.03027332507734576
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5130718954248366,
"acc_stderr": 0.020220920829626916,
"acc_norm": 0.5130718954248366,
"acc_norm_stderr": 0.020220920829626916
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080508,
"mc2": 0.37535380227171294,
"mc2_stderr": 0.013767926078311071
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Chat-LLaMAfied | 2023-10-10T07:32:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of hiyouga/Baichuan2-7B-Chat-LLaMAfied
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hiyouga/Baichuan2-7B-Chat-LLaMAfied](https://huggingface.co/hiyouga/Baichuan2-7B-Chat-LLaMAfied)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Chat-LLaMAfied\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T07:31:02.024016](https://huggingface.co/datasets/open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Chat-LLaMAfied/blob/main/results_2023-10-10T07-31-02.024016.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.538186377170592,\n\
\ \"acc_stderr\": 0.03456669209691277,\n \"acc_norm\": 0.542012953205552,\n\
\ \"acc_norm_stderr\": 0.03455639779543999,\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.01613222972815505,\n \"mc2\": 0.48040516908972264,\n\
\ \"mc2_stderr\": 0.015522606282564484\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49146757679180886,\n \"acc_stderr\": 0.01460926316563219,\n\
\ \"acc_norm\": 0.5247440273037542,\n \"acc_norm_stderr\": 0.014593487694937738\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5478988249352719,\n\
\ \"acc_stderr\": 0.004966832553245048,\n \"acc_norm\": 0.740390360485959,\n\
\ \"acc_norm_stderr\": 0.00437524423704513\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5509433962264151,\n \"acc_stderr\": 0.030612730713641095,\n\
\ \"acc_norm\": 0.5509433962264151,\n \"acc_norm_stderr\": 0.030612730713641095\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\
\ \"acc_stderr\": 0.03804749744364763,\n \"acc_norm\": 0.4682080924855491,\n\
\ \"acc_norm_stderr\": 0.03804749744364763\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.0433643270799318,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.0433643270799318\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374766,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374766\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3253968253968254,\n \"acc_stderr\": 0.02413015829976262,\n \"\
acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.02413015829976262\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.027379871229943238,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.027379871229943238\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406796,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406796\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4948717948717949,\n \"acc_stderr\": 0.02534967290683866,\n \
\ \"acc_norm\": 0.4948717948717949,\n \"acc_norm_stderr\": 0.02534967290683866\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.03221943636566196,\n \
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.03221943636566196\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7339449541284404,\n\
\ \"acc_stderr\": 0.018946022322225593,\n \"acc_norm\": 0.7339449541284404,\n\
\ \"acc_norm_stderr\": 0.018946022322225593\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n\
\ \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7107843137254902,\n \"acc_stderr\": 0.03182231867647553,\n \"\
acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.03182231867647553\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5829596412556054,\n\
\ \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.5829596412556054,\n\
\ \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.042059539338841226,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.042059539338841226\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326467,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326467\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.02581923325648371,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.02581923325648371\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7496807151979565,\n\
\ \"acc_stderr\": 0.01549108895149458,\n \"acc_norm\": 0.7496807151979565,\n\
\ \"acc_norm_stderr\": 0.01549108895149458\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.026589231142174263,\n\
\ \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.026589231142174263\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2569832402234637,\n\
\ \"acc_stderr\": 0.014614465821966339,\n \"acc_norm\": 0.2569832402234637,\n\
\ \"acc_norm_stderr\": 0.014614465821966339\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5718954248366013,\n \"acc_stderr\": 0.028332397483664274,\n\
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.028332397483664274\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n\
\ \"acc_stderr\": 0.027690337536485376,\n \"acc_norm\": 0.6109324758842444,\n\
\ \"acc_norm_stderr\": 0.027690337536485376\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722338,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722338\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612503,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612503\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39960886571056065,\n\
\ \"acc_stderr\": 0.012510181636960672,\n \"acc_norm\": 0.39960886571056065,\n\
\ \"acc_norm_stderr\": 0.012510181636960672\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4963235294117647,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.4963235294117647,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5408496732026143,\n \"acc_stderr\": 0.020160213617222516,\n \
\ \"acc_norm\": 0.5408496732026143,\n \"acc_norm_stderr\": 0.020160213617222516\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.046737523336702384,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.046737523336702384\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674268,\n\
\ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674268\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.03218093795602357,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.03218093795602357\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.01613222972815505,\n \"mc2\": 0.48040516908972264,\n\
\ \"mc2_stderr\": 0.015522606282564484\n }\n}\n```"
repo_url: https://huggingface.co/hiyouga/Baichuan2-7B-Chat-LLaMAfied
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|arc:challenge|25_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hellaswag|10_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-31-02.024016.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-31-02.024016.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T07-31-02.024016.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T07-31-02.024016.parquet'
- config_name: results
data_files:
- split: 2023_10_10T07_31_02.024016
path:
- results_2023-10-10T07-31-02.024016.parquet
- split: latest
path:
- results_2023-10-10T07-31-02.024016.parquet
---
# Dataset Card for Evaluation run of hiyouga/Baichuan2-7B-Chat-LLaMAfied
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/hiyouga/Baichuan2-7B-Chat-LLaMAfied
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [hiyouga/Baichuan2-7B-Chat-LLaMAfied](https://huggingface.co/hiyouga/Baichuan2-7B-Chat-LLaMAfied) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Chat-LLaMAfied",
"harness_truthfulqa_mc_0",
split="train")
```
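If you only need the aggregated scores rather than the per-example details, you can load the "results" configuration instead. A minimal sketch, assuming the same dataset repository and the "latest" split alias listed in the configs above:
```python
from datasets import load_dataset

# Load the aggregated metrics (the "results" config) at their latest state;
# "latest" is the split alias that always points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Chat-LLaMAfied",
    "results",
    split="latest",
)
print(results[0])  # one row containing the aggregated scores for the run
```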
## Latest results
These are the [latest results from run 2023-10-10T07:31:02.024016](https://huggingface.co/datasets/open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Chat-LLaMAfied/blob/main/results_2023-10-10T07-31-02.024016.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.538186377170592,
"acc_stderr": 0.03456669209691277,
"acc_norm": 0.542012953205552,
"acc_norm_stderr": 0.03455639779543999,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.01613222972815505,
"mc2": 0.48040516908972264,
"mc2_stderr": 0.015522606282564484
},
"harness|arc:challenge|25": {
"acc": 0.49146757679180886,
"acc_stderr": 0.01460926316563219,
"acc_norm": 0.5247440273037542,
"acc_norm_stderr": 0.014593487694937738
},
"harness|hellaswag|10": {
"acc": 0.5478988249352719,
"acc_stderr": 0.004966832553245048,
"acc_norm": 0.740390360485959,
"acc_norm_stderr": 0.00437524423704513
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5509433962264151,
"acc_stderr": 0.030612730713641095,
"acc_norm": 0.5509433962264151,
"acc_norm_stderr": 0.030612730713641095
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364763,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364763
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.0433643270799318,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.0433643270799318
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374766,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374766
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.02413015829976262,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.02413015829976262
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.027379871229943238,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.027379871229943238
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406796,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406796
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4948717948717949,
"acc_stderr": 0.02534967290683866,
"acc_norm": 0.4948717948717949,
"acc_norm_stderr": 0.02534967290683866
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7339449541284404,
"acc_stderr": 0.018946022322225593,
"acc_norm": 0.7339449541284404,
"acc_norm_stderr": 0.018946022322225593
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.03182231867647553,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.03182231867647553
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5829596412556054,
"acc_stderr": 0.03309266936071721,
"acc_norm": 0.5829596412556054,
"acc_norm_stderr": 0.03309266936071721
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969637,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969637
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.042059539338841226,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.042059539338841226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326467,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326467
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.02581923325648371,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.02581923325648371
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7496807151979565,
"acc_stderr": 0.01549108895149458,
"acc_norm": 0.7496807151979565,
"acc_norm_stderr": 0.01549108895149458
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.026589231142174263,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.026589231142174263
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2569832402234637,
"acc_stderr": 0.014614465821966339,
"acc_norm": 0.2569832402234637,
"acc_norm_stderr": 0.014614465821966339
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.028332397483664274,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.028332397483664274
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485376,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485376
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.026774929899722338,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.026774929899722338
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39960886571056065,
"acc_stderr": 0.012510181636960672,
"acc_norm": 0.39960886571056065,
"acc_norm_stderr": 0.012510181636960672
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4963235294117647,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.4963235294117647,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5408496732026143,
"acc_stderr": 0.020160213617222516,
"acc_norm": 0.5408496732026143,
"acc_norm_stderr": 0.020160213617222516
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.046737523336702384,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.046737523336702384
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.03151236044674268,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.03151236044674268
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.03218093795602357,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.03218093795602357
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.01613222972815505,
"mc2": 0.48040516908972264,
"mc2_stderr": 0.015522606282564484
}
}
```
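As an illustration, per-task scores in a results dictionary shaped like the JSON above can be filtered with plain Python. This is a minimal sketch, not part of the evaluation pipeline; the `results` dict below is a two-entry stand-in for the full dictionary:
```python
# Minimal stand-in for the full results dictionary printed above.
results = {
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7719298245614035},
    "harness|truthfulqa:mc|0": {"mc1": 0.30599755201958384},
}

# Collect per-subtask accuracy for the MMLU ("hendrycksTest") entries only.
mmlu_acc = {
    task.split("-", 1)[1].split("|")[0]: metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
}
print(mmlu_acc)  # {'world_religions': 0.7719298245614035}
```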
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_JosephusCheung__Qwen-VL-LLaMAfied-7B-Chat | 2023-10-10T07:41:08.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of JosephusCheung/Qwen-VL-LLaMAfied-7B-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JosephusCheung/Qwen-VL-LLaMAfied-7B-Chat](https://huggingface.co/JosephusCheung/Qwen-VL-LLaMAfied-7B-Chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JosephusCheung__Qwen-VL-LLaMAfied-7B-Chat\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T07:39:47.100914](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Qwen-VL-LLaMAfied-7B-Chat/blob/main/results_2023-10-10T07-39-47.100914.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.44291345536273574,\n\
\ \"acc_stderr\": 0.03513140512866742,\n \"acc_norm\": 0.4461672418802564,\n\
\ \"acc_norm_stderr\": 0.03512502259107083,\n \"mc1\": 0.2864137086903305,\n\
\ \"mc1_stderr\": 0.015826142439502346,\n \"mc2\": 0.428667116433953,\n\
\ \"mc2_stderr\": 0.015095774970188642\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.45733788395904434,\n \"acc_stderr\": 0.014558106543924068,\n\
\ \"acc_norm\": 0.4735494880546075,\n \"acc_norm_stderr\": 0.014590931358120172\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5238996215893248,\n\
\ \"acc_stderr\": 0.004984077906216098,\n \"acc_norm\": 0.6996614220274846,\n\
\ \"acc_norm_stderr\": 0.004574683373821049\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4679245283018868,\n \"acc_stderr\": 0.03070948699255655,\n\
\ \"acc_norm\": 0.4679245283018868,\n \"acc_norm_stderr\": 0.03070948699255655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421255,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421255\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n\
\ \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.3872832369942196,\n\
\ \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400175,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400175\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5387096774193548,\n\
\ \"acc_stderr\": 0.028358634859836935,\n \"acc_norm\": 0.5387096774193548,\n\
\ \"acc_norm_stderr\": 0.028358634859836935\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n\
\ \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.42424242424242425,\n \"acc_stderr\": 0.038592681420702615,\n\
\ \"acc_norm\": 0.42424242424242425,\n \"acc_norm_stderr\": 0.038592681420702615\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5656565656565656,\n \"acc_stderr\": 0.035315058793591834,\n \"\
acc_norm\": 0.5656565656565656,\n \"acc_norm_stderr\": 0.035315058793591834\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6269430051813472,\n \"acc_stderr\": 0.03490205592048574,\n\
\ \"acc_norm\": 0.6269430051813472,\n \"acc_norm_stderr\": 0.03490205592048574\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4205128205128205,\n \"acc_stderr\": 0.025028610276710855,\n\
\ \"acc_norm\": 0.4205128205128205,\n \"acc_norm_stderr\": 0.025028610276710855\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766118,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766118\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.44537815126050423,\n \"acc_stderr\": 0.032284106267163895,\n\
\ \"acc_norm\": 0.44537815126050423,\n \"acc_norm_stderr\": 0.032284106267163895\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987053,\n \"\
acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987053\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5724770642201835,\n \"acc_stderr\": 0.021210910204300437,\n \"\
acc_norm\": 0.5724770642201835,\n \"acc_norm_stderr\": 0.021210910204300437\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3055555555555556,\n \"acc_stderr\": 0.031415546294025445,\n \"\
acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.031415546294025445\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5490196078431373,\n \"acc_stderr\": 0.034924061041636124,\n \"\
acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.034924061041636124\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5949367088607594,\n \"acc_stderr\": 0.03195514741370672,\n \
\ \"acc_norm\": 0.5949367088607594,\n \"acc_norm_stderr\": 0.03195514741370672\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.57847533632287,\n\
\ \"acc_stderr\": 0.033141902221106564,\n \"acc_norm\": 0.57847533632287,\n\
\ \"acc_norm_stderr\": 0.033141902221106564\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4351145038167939,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.4351145038167939,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.043207678075366705,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.043207678075366705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4785276073619632,\n \"acc_stderr\": 0.0392474687675113,\n\
\ \"acc_norm\": 0.4785276073619632,\n \"acc_norm_stderr\": 0.0392474687675113\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5048543689320388,\n \"acc_stderr\": 0.049505043821289195,\n\
\ \"acc_norm\": 0.5048543689320388,\n \"acc_norm_stderr\": 0.049505043821289195\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n\
\ \"acc_stderr\": 0.028286324075564393,\n \"acc_norm\": 0.7521367521367521,\n\
\ \"acc_norm_stderr\": 0.028286324075564393\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6143039591315453,\n\
\ \"acc_stderr\": 0.017406476619212904,\n \"acc_norm\": 0.6143039591315453,\n\
\ \"acc_norm_stderr\": 0.017406476619212904\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.026882643434022885,\n\
\ \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.026882643434022885\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095268,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095268\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.028358956313423545,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.028358956313423545\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4758842443729904,\n\
\ \"acc_stderr\": 0.028365041542564577,\n \"acc_norm\": 0.4758842443729904,\n\
\ \"acc_norm_stderr\": 0.028365041542564577\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4876543209876543,\n \"acc_stderr\": 0.027812262269327242,\n\
\ \"acc_norm\": 0.4876543209876543,\n \"acc_norm_stderr\": 0.027812262269327242\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.31560283687943264,\n \"acc_stderr\": 0.027724989449509317,\n \
\ \"acc_norm\": 0.31560283687943264,\n \"acc_norm_stderr\": 0.027724989449509317\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3500651890482399,\n\
\ \"acc_stderr\": 0.01218255231321517,\n \"acc_norm\": 0.3500651890482399,\n\
\ \"acc_norm_stderr\": 0.01218255231321517\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.31985294117647056,\n \"acc_stderr\": 0.028332959514031236,\n\
\ \"acc_norm\": 0.31985294117647056,\n \"acc_norm_stderr\": 0.028332959514031236\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.42483660130718953,\n \"acc_stderr\": 0.019997973035458336,\n \
\ \"acc_norm\": 0.42483660130718953,\n \"acc_norm_stderr\": 0.019997973035458336\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5551020408163265,\n \"acc_stderr\": 0.031814251181977865,\n\
\ \"acc_norm\": 0.5551020408163265,\n \"acc_norm_stderr\": 0.031814251181977865\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.582089552238806,\n\
\ \"acc_stderr\": 0.03487558640462064,\n \"acc_norm\": 0.582089552238806,\n\
\ \"acc_norm_stderr\": 0.03487558640462064\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.037627386999170565,\n\
\ \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.037627386999170565\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n\
\ \"mc1_stderr\": 0.015826142439502346,\n \"mc2\": 0.428667116433953,\n\
\ \"mc2_stderr\": 0.015095774970188642\n }\n}\n```"
repo_url: https://huggingface.co/JosephusCheung/Qwen-VL-LLaMAfied-7B-Chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|arc:challenge|25_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hellaswag|10_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-39-47.100914.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-39-47.100914.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T07-39-47.100914.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T07-39-47.100914.parquet'
- config_name: results
data_files:
- split: 2023_10_10T07_39_47.100914
path:
- results_2023-10-10T07-39-47.100914.parquet
- split: latest
path:
- results_2023-10-10T07-39-47.100914.parquet
---
# Dataset Card for Evaluation run of JosephusCheung/Qwen-VL-LLaMAfied-7B-Chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/JosephusCheung/Qwen-VL-LLaMAfied-7B-Chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [JosephusCheung/Qwen-VL-LLaMAfied-7B-Chat](https://huggingface.co/JosephusCheung/Qwen-VL-LLaMAfied-7B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
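Since each evaluated task maps to its own configuration, the available configurations (including the aggregated "results" one) can be enumerated with the `datasets` inspection helper; a minimal sketch:
```python
from datasets import get_dataset_config_names

# List the per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_JosephusCheung__Qwen-VL-LLaMAfied-7B-Chat"
)
print(len(configs), configs[:5])
```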
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JosephusCheung__Qwen-VL-LLaMAfied-7B-Chat",
"harness_truthfulqa_mc_0",
split="train")
```
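Similarly, the aggregated scores can be loaded from the "results" configuration; per the configs listed in this card's YAML header, every configuration also exposes a "latest" split that tracks the most recent run:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; the "results" config and
# "latest" split names come from the configs listed in the YAML header.
results = load_dataset(
    "open-llm-leaderboard/details_JosephusCheung__Qwen-VL-LLaMAfied-7B-Chat",
    "results",
    split="latest",
)
print(results[0])
```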
## Latest results
These are the [latest results from run 2023-10-10T07:39:47.100914](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Qwen-VL-LLaMAfied-7B-Chat/blob/main/results_2023-10-10T07-39-47.100914.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.44291345536273574,
"acc_stderr": 0.03513140512866742,
"acc_norm": 0.4461672418802564,
"acc_norm_stderr": 0.03512502259107083,
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502346,
"mc2": 0.428667116433953,
"mc2_stderr": 0.015095774970188642
},
"harness|arc:challenge|25": {
"acc": 0.45733788395904434,
"acc_stderr": 0.014558106543924068,
"acc_norm": 0.4735494880546075,
"acc_norm_stderr": 0.014590931358120172
},
"harness|hellaswag|10": {
"acc": 0.5238996215893248,
"acc_stderr": 0.004984077906216098,
"acc_norm": 0.6996614220274846,
"acc_norm_stderr": 0.004574683373821049
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4679245283018868,
"acc_stderr": 0.03070948699255655,
"acc_norm": 0.4679245283018868,
"acc_norm_stderr": 0.03070948699255655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421255,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421255
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400175,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400175
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5387096774193548,
"acc_stderr": 0.028358634859836935,
"acc_norm": 0.5387096774193548,
"acc_norm_stderr": 0.028358634859836935
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.42424242424242425,
"acc_stderr": 0.038592681420702615,
"acc_norm": 0.42424242424242425,
"acc_norm_stderr": 0.038592681420702615
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5656565656565656,
"acc_stderr": 0.035315058793591834,
"acc_norm": 0.5656565656565656,
"acc_norm_stderr": 0.035315058793591834
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6269430051813472,
"acc_stderr": 0.03490205592048574,
"acc_norm": 0.6269430051813472,
"acc_norm_stderr": 0.03490205592048574
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4205128205128205,
"acc_stderr": 0.025028610276710855,
"acc_norm": 0.4205128205128205,
"acc_norm_stderr": 0.025028610276710855
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766118,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766118
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.44537815126050423,
"acc_stderr": 0.032284106267163895,
"acc_norm": 0.44537815126050423,
"acc_norm_stderr": 0.032284106267163895
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.03445406271987053,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.03445406271987053
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5724770642201835,
"acc_stderr": 0.021210910204300437,
"acc_norm": 0.5724770642201835,
"acc_norm_stderr": 0.021210910204300437
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.031415546294025445,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.031415546294025445
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.034924061041636124,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.034924061041636124
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5949367088607594,
"acc_stderr": 0.03195514741370672,
"acc_norm": 0.5949367088607594,
"acc_norm_stderr": 0.03195514741370672
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.57847533632287,
"acc_stderr": 0.033141902221106564,
"acc_norm": 0.57847533632287,
"acc_norm_stderr": 0.033141902221106564
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4351145038167939,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.4351145038167939,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.043207678075366705,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.043207678075366705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4785276073619632,
"acc_stderr": 0.0392474687675113,
"acc_norm": 0.4785276073619632,
"acc_norm_stderr": 0.0392474687675113
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.5048543689320388,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.5048543689320388,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7521367521367521,
"acc_stderr": 0.028286324075564393,
"acc_norm": 0.7521367521367521,
"acc_norm_stderr": 0.028286324075564393
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6143039591315453,
"acc_stderr": 0.017406476619212904,
"acc_norm": 0.6143039591315453,
"acc_norm_stderr": 0.017406476619212904
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.026882643434022885,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.026882643434022885
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095268,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095268
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.028358956313423545,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.028358956313423545
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4758842443729904,
"acc_stderr": 0.028365041542564577,
"acc_norm": 0.4758842443729904,
"acc_norm_stderr": 0.028365041542564577
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4876543209876543,
"acc_stderr": 0.027812262269327242,
"acc_norm": 0.4876543209876543,
"acc_norm_stderr": 0.027812262269327242
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.31560283687943264,
"acc_stderr": 0.027724989449509317,
"acc_norm": 0.31560283687943264,
"acc_norm_stderr": 0.027724989449509317
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3500651890482399,
"acc_stderr": 0.01218255231321517,
"acc_norm": 0.3500651890482399,
"acc_norm_stderr": 0.01218255231321517
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.31985294117647056,
"acc_stderr": 0.028332959514031236,
"acc_norm": 0.31985294117647056,
"acc_norm_stderr": 0.028332959514031236
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42483660130718953,
"acc_stderr": 0.019997973035458336,
"acc_norm": 0.42483660130718953,
"acc_norm_stderr": 0.019997973035458336
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5551020408163265,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.5551020408163265,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.582089552238806,
"acc_stderr": 0.03487558640462064,
"acc_norm": 0.582089552238806,
"acc_norm_stderr": 0.03487558640462064
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.037627386999170565,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.037627386999170565
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502346,
"mc2": 0.428667116433953,
"mc2_stderr": 0.015095774970188642
}
}
```
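For readers post-processing these numbers, the per-task entries above form a flat JSON object keyed by `harness|<task>|<n_shots>`. A minimal sketch of aggregating them in plain Python (the two-entry excerpt and the averaging choice are illustrative only, not part of the evaluation harness):
```python
import json

# A two-entry excerpt of the results block above, shortened for illustration.
results_json = """{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.27, "acc_norm": 0.27},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.4074074074074074, "acc_norm": 0.4074074074074074}
}"""
results = json.loads(results_json)

# Average acc_norm over the MMLU (hendrycksTest) sub-tasks present in the dict.
mmlu = [m["acc_norm"] for task, m in results.items()
        if task.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU sub-tasks, mean acc_norm = {sum(mmlu) / len(mmlu):.4f}")
```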
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mickylan2367/LoadingScriptPractice | 2023-10-11T00:59:08.000Z | [
"language:en",
"license:cc-by-sa-4.0",
"music",
"region:us"
] | mickylan2367 | null | null | null | 0 | 0 | ---
license: cc-by-sa-4.0
language:
- en
tags:
- music
---
* A practice repository for trying out a loading script that uses the Hugging Face API. |
ravivishwakarmauzio/finetuning_llama2 | 2023-10-10T09:09:07.000Z | [
"region:us"
] | ravivishwakarmauzio | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 338808
num_examples: 200
download_size: 0
dataset_size: 338808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "finetuning_llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial | 2023-10-10T08:04:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Charlie911/vicuna-7b-v1.5-lora-timedial
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Charlie911/vicuna-7b-v1.5-lora-timedial](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-timedial)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T08:03:27.841263](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial/blob/main/results_2023-10-10T08-03-27.841263.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5054146605277109,\n\
\ \"acc_stderr\": 0.03502228375345075,\n \"acc_norm\": 0.5094480730613852,\n\
\ \"acc_norm_stderr\": 0.03501001177620598,\n \"mc1\": 0.2729498164014688,\n\
\ \"mc1_stderr\": 0.015594753632006518,\n \"mc2\": 0.41601570347119143,\n\
\ \"mc2_stderr\": 0.014201339674377911\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.48890784982935154,\n \"acc_stderr\": 0.014607794914013048,\n\
\ \"acc_norm\": 0.5290102389078498,\n \"acc_norm_stderr\": 0.014586776355294317\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5650268870742879,\n\
\ \"acc_stderr\": 0.004947402907996251,\n \"acc_norm\": 0.7628958374825732,\n\
\ \"acc_norm_stderr\": 0.004244374809273613\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5320754716981132,\n \"acc_stderr\": 0.03070948699255655,\n\
\ \"acc_norm\": 0.5320754716981132,\n \"acc_norm_stderr\": 0.03070948699255655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\
\ \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.5138888888888888,\n\
\ \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.03807301726504511,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.03807301726504511\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149352,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149352\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29894179894179895,\n \"acc_stderr\": 0.02357760479165581,\n \"\
acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.02357760479165581\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.042407993275749255,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.042407993275749255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5483870967741935,\n\
\ \"acc_stderr\": 0.028310500348568392,\n \"acc_norm\": 0.5483870967741935,\n\
\ \"acc_norm_stderr\": 0.028310500348568392\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.03781887353205982,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.03781887353205982\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.032577140777096614,\n\
\ \"acc_norm\": 0.7150259067357513,\n \"acc_norm_stderr\": 0.032577140777096614\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.025275892070240634,\n\
\ \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.025275892070240634\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4495798319327731,\n \"acc_stderr\": 0.03231293497137707,\n \
\ \"acc_norm\": 0.4495798319327731,\n \"acc_norm_stderr\": 0.03231293497137707\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6862385321100918,\n \"acc_stderr\": 0.019894723341469113,\n \"\
acc_norm\": 0.6862385321100918,\n \"acc_norm_stderr\": 0.019894723341469113\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6764705882352942,\n \"acc_stderr\": 0.03283472056108561,\n \"\
acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03283472056108561\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.02917868230484253,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.02917868230484253\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5785123966942148,\n \"acc_stderr\": 0.045077322787750874,\n \"\
acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.045077322787750874\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190192,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5337423312883436,\n \"acc_stderr\": 0.039194155450484096,\n\
\ \"acc_norm\": 0.5337423312883436,\n \"acc_norm_stderr\": 0.039194155450484096\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.04750458399041694,\n\
\ \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.04750458399041694\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n\
\ \"acc_stderr\": 0.02742100729539292,\n \"acc_norm\": 0.7735042735042735,\n\
\ \"acc_norm_stderr\": 0.02742100729539292\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6807151979565773,\n\
\ \"acc_stderr\": 0.016671261749538722,\n \"acc_norm\": 0.6807151979565773,\n\
\ \"acc_norm_stderr\": 0.016671261749538722\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.026842985519615375,\n\
\ \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.026842985519615375\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574908,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574908\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.02835895631342355,\n\
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.02835895631342355\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5432098765432098,\n \"acc_stderr\": 0.027716661650194038,\n\
\ \"acc_norm\": 0.5432098765432098,\n \"acc_norm_stderr\": 0.027716661650194038\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115882,\n \
\ \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115882\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3748370273794003,\n\
\ \"acc_stderr\": 0.01236365246755192,\n \"acc_norm\": 0.3748370273794003,\n\
\ \"acc_norm_stderr\": 0.01236365246755192\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4803921568627451,\n \"acc_stderr\": 0.020212274976302954,\n \
\ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.020212274976302954\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\
\ \"acc_stderr\": 0.032801882053486435,\n \"acc_norm\": 0.6865671641791045,\n\
\ \"acc_norm_stderr\": 0.032801882053486435\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824562,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824562\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2729498164014688,\n\
\ \"mc1_stderr\": 0.015594753632006518,\n \"mc2\": 0.41601570347119143,\n\
\ \"mc2_stderr\": 0.014201339674377911\n }\n}\n```"
repo_url: https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-timedial
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|arc:challenge|25_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hellaswag|10_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-03-27.841263.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-03-27.841263.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T08-03-27.841263.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T08-03-27.841263.parquet'
- config_name: results
data_files:
- split: 2023_10_10T08_03_27.841263
path:
- results_2023-10-10T08-03-27.841263.parquet
- split: latest
path:
- results_2023-10-10T08-03-27.841263.parquet
---
# Dataset Card for Evaluation run of Charlie911/vicuna-7b-v1.5-lora-timedial
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-timedial
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Charlie911/vicuna-7b-v1.5-lora-timedial](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-timedial) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial",
"harness_truthfulqa_mc_0",
split="train")
```
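The aggregated metrics for a run are also exposed through the "results" configuration declared in the metadata above. A minimal sketch, assuming the `latest` split alias defined there:
```python
from datasets import load_dataset

# Load the aggregated results of the most recent evaluation run;
# "latest" is the split alias declared in the YAML metadata above.
results = load_dataset(
    "open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial",
    "results",
    split="latest",
)
```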
## Latest results
These are the [latest results from run 2023-10-10T08:03:27.841263](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial/blob/main/results_2023-10-10T08-03-27.841263.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5054146605277109,
"acc_stderr": 0.03502228375345075,
"acc_norm": 0.5094480730613852,
"acc_norm_stderr": 0.03501001177620598,
"mc1": 0.2729498164014688,
"mc1_stderr": 0.015594753632006518,
"mc2": 0.41601570347119143,
"mc2_stderr": 0.014201339674377911
},
"harness|arc:challenge|25": {
"acc": 0.48890784982935154,
"acc_stderr": 0.014607794914013048,
"acc_norm": 0.5290102389078498,
"acc_norm_stderr": 0.014586776355294317
},
"harness|hellaswag|10": {
"acc": 0.5650268870742879,
"acc_stderr": 0.004947402907996251,
"acc_norm": 0.7628958374825732,
"acc_norm_stderr": 0.004244374809273613
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5320754716981132,
"acc_stderr": 0.03070948699255655,
"acc_norm": 0.5320754716981132,
"acc_norm_stderr": 0.03070948699255655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.04179596617581,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.04179596617581
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.03807301726504511,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.03807301726504511
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149352,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149352
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.02357760479165581,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.02357760479165581
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.042407993275749255,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.042407993275749255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5483870967741935,
"acc_stderr": 0.028310500348568392,
"acc_norm": 0.5483870967741935,
"acc_norm_stderr": 0.028310500348568392
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.03413963805906235,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.03413963805906235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.03781887353205982,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.03781887353205982
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7150259067357513,
"acc_stderr": 0.032577140777096614,
"acc_norm": 0.7150259067357513,
"acc_norm_stderr": 0.032577140777096614
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.025275892070240634,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.025275892070240634
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4495798319327731,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.4495798319327731,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6862385321100918,
"acc_stderr": 0.019894723341469113,
"acc_norm": 0.6862385321100918,
"acc_norm_stderr": 0.019894723341469113
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03283472056108561,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03283472056108561
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.02917868230484253,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.02917868230484253
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.045077322787750874,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.045077322787750874
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190192,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5337423312883436,
"acc_stderr": 0.039194155450484096,
"acc_norm": 0.5337423312883436,
"acc_norm_stderr": 0.039194155450484096
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.04750458399041694,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.04750458399041694
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7735042735042735,
"acc_stderr": 0.02742100729539292,
"acc_norm": 0.7735042735042735,
"acc_norm_stderr": 0.02742100729539292
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6807151979565773,
"acc_stderr": 0.016671261749538722,
"acc_norm": 0.6807151979565773,
"acc_norm_stderr": 0.016671261749538722
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574908,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574908
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.02835895631342355,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.02835895631342355
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.02795048149440127,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.02795048149440127
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5432098765432098,
"acc_stderr": 0.027716661650194038,
"acc_norm": 0.5432098765432098,
"acc_norm_stderr": 0.027716661650194038
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.028893955412115882,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.028893955412115882
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3748370273794003,
"acc_stderr": 0.01236365246755192,
"acc_norm": 0.3748370273794003,
"acc_norm_stderr": 0.01236365246755192
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.03029061918048569,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.03029061918048569
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.020212274976302954,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.020212274976302954
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.032801882053486435,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.032801882053486435
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824562,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824562
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2729498164014688,
"mc1_stderr": 0.015594753632006518,
"mc2": 0.41601570347119143,
"mc2_stderr": 0.014201339674377911
}
}
```
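The aggregated numbers above are also available programmatically through the "results" configuration, so they can be reloaded without parsing this card:
```python
from datasets import load_dataset

# "latest" always resolves to the newest results parquet for this model.
results = load_dataset(
    "open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial",
    "results",
    split="latest",
)
```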
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tinhpx2911/vi_general_2 | 2023-10-10T08:14:59.000Z | [
"region:us"
] | tinhpx2911 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1229380167
num_examples: 2767613
download_size: 651909940
dataset_size: 1229380167
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "vi_general_2"
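The metadata above describes a single `text` column and one `train` split of 2,767,613 examples (about 652 MB to download). A minimal loading sketch, assuming the repository is publicly accessible:
```python
from datasets import load_dataset

# Stream the corpus to avoid materializing the full download at once.
ds = load_dataset("tinhpx2911/vi_general_2", split="train", streaming=True)
print(next(iter(ds))["text"][:200])  # peek at the first record
```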
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_JosephusCheung__Pwen-VL-Chat-20_30 | 2023-10-10T08:18:43.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of JosephusCheung/Pwen-VL-Chat-20_30
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JosephusCheung/Pwen-VL-Chat-20_30](https://huggingface.co/JosephusCheung/Pwen-VL-Chat-20_30)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JosephusCheung__Pwen-VL-Chat-20_30\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T08:17:20.929764](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Pwen-VL-Chat-20_30/blob/main/results_2023-10-10T08-17-20.929764.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5614045523007456,\n\
\ \"acc_stderr\": 0.034472805150990236,\n \"acc_norm\": 0.5650409022375938,\n\
\ \"acc_norm_stderr\": 0.03446466967324352,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.42517178573631115,\n\
\ \"mc2_stderr\": 0.01461529390566251\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4709897610921502,\n \"acc_stderr\": 0.014586776355294316,\n\
\ \"acc_norm\": 0.5017064846416383,\n \"acc_norm_stderr\": 0.01461130570505699\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5382393945429197,\n\
\ \"acc_stderr\": 0.004975167382061832,\n \"acc_norm\": 0.7220673172674766,\n\
\ \"acc_norm_stderr\": 0.004470644845242893\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.043192236258113324,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.043192236258113324\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981748,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981748\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.02951470358398177,\n\
\ \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.02951470358398177\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n\
\ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n\
\ \"acc_stderr\": 0.04576665403207762,\n \"acc_norm\": 0.30392156862745096,\n\
\ \"acc_norm_stderr\": 0.04576665403207762\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.502127659574468,\n \"acc_stderr\": 0.032685726586674915,\n \"\
acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.032685726586674915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819067,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819067\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\
\ \"acc_stderr\": 0.026593084516572284,\n \"acc_norm\": 0.6774193548387096,\n\
\ \"acc_norm_stderr\": 0.026593084516572284\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n\
\ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533084,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533084\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807257,\n\
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507382,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507382\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n \
\ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7541284403669725,\n \"acc_stderr\": 0.018461940968708443,\n \"\
acc_norm\": 0.7541284403669725,\n \"acc_norm_stderr\": 0.018461940968708443\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\
acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6029411764705882,\n \"acc_stderr\": 0.03434131164719129,\n \"\
acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.03434131164719129\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.04453197507374984,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.04453197507374984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.02466249684520982,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.02466249684520982\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7624521072796935,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.7624521072796935,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.02629622791561367,\n\
\ \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.02629622791561367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n\
\ \"acc_stderr\": 0.01536686038639711,\n \"acc_norm\": 0.3027932960893855,\n\
\ \"acc_norm_stderr\": 0.01536686038639711\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.02758281141515961,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.02758281141515961\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.02698147804364804,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.02698147804364804\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.02935491115994098,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.02935491115994098\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4517601043024772,\n\
\ \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.4517601043024772,\n\
\ \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5718954248366013,\n \"acc_stderr\": 0.0200176292142131,\n \
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.0200176292142131\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087548,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087548\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.42517178573631115,\n\
\ \"mc2_stderr\": 0.01461529390566251\n }\n}\n```"
repo_url: https://huggingface.co/JosephusCheung/Pwen-VL-Chat-20_30
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|arc:challenge|25_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hellaswag|10_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-17-20.929764.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-17-20.929764.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T08-17-20.929764.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T08-17-20.929764.parquet'
- config_name: results
data_files:
- split: 2023_10_10T08_17_20.929764
path:
- results_2023-10-10T08-17-20.929764.parquet
- split: latest
path:
- results_2023-10-10T08-17-20.929764.parquet
---
# Dataset Card for Evaluation run of JosephusCheung/Pwen-VL-Chat-20_30
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/JosephusCheung/Pwen-VL-Chat-20_30
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [JosephusCheung/Pwen-VL-Chat-20_30](https://huggingface.co/JosephusCheung/Pwen-VL-Chat-20_30) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JosephusCheung__Pwen-VL-Chat-20_30",
"harness_truthfulqa_mc_0",
split="train")
```
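The same call works for any of the per-task configurations listed in this card's metadata; for example, the five-shot MMLU abstract-algebra details:
```python
from datasets import load_dataset

# Substitute any config_name from the YAML header above.
data = load_dataset(
    "open-llm-leaderboard/details_JosephusCheung__Pwen-VL-Chat-20_30",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```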
## Latest results
These are the [latest results from run 2023-10-10T08:17:20.929764](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Pwen-VL-Chat-20_30/blob/main/results_2023-10-10T08-17-20.929764.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5614045523007456,
"acc_stderr": 0.034472805150990236,
"acc_norm": 0.5650409022375938,
"acc_norm_stderr": 0.03446466967324352,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.42517178573631115,
"mc2_stderr": 0.01461529390566251
},
"harness|arc:challenge|25": {
"acc": 0.4709897610921502,
"acc_stderr": 0.014586776355294316,
"acc_norm": 0.5017064846416383,
"acc_norm_stderr": 0.01461130570505699
},
"harness|hellaswag|10": {
"acc": 0.5382393945429197,
"acc_stderr": 0.004975167382061832,
"acc_norm": 0.7220673172674766,
"acc_norm_stderr": 0.004470644845242893
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.043192236258113324,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.043192236258113324
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981748,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981748
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.02951470358398177,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.02951470358398177
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819067,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819067
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572284,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572284
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533084,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507382,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507382
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7541284403669725,
"acc_stderr": 0.018461940968708443,
"acc_norm": 0.7541284403669725,
"acc_norm_stderr": 0.018461940968708443
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.03434131164719129,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.03434131164719129
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.04453197507374984,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.04453197507374984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039476,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039476
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520982,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520982
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7624521072796935,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.7624521072796935,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.02629622791561367,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.02629622791561367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.01536686038639711,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.01536686038639711
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.02758281141515961,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.02758281141515961
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364804,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364804
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.02935491115994098,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.02935491115994098
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4517601043024772,
"acc_stderr": 0.012710662233660247,
"acc_norm": 0.4517601043024772,
"acc_norm_stderr": 0.012710662233660247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.0200176292142131,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.0200176292142131
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.030862144921087548,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.030862144921087548
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.42517178573631115,
"mc2_stderr": 0.01461529390566251
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
giovanni92/maildata | 2023-10-10T08:28:55.000Z | [
"license:mit",
"region:us"
] | giovanni92 | null | null | null | 0 | 0 | ---
license: mit
---
|
ShreeyaVenneti/8_SENTENCE_APPENDED | 2023-10-10T08:39:25.000Z | [
"region:us"
] | ShreeyaVenneti | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13 | 2023-10-10T08:33:32.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-mistral-7b-v13
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-mistral-7b-v13](https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v13)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T08:32:08.394718](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13/blob/main/results_2023-10-10T08-32-08.394718.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5620369791952703,\n\
\ \"acc_stderr\": 0.03447878252384387,\n \"acc_norm\": 0.5658548856783289,\n\
\ \"acc_norm_stderr\": 0.03446768877961296,\n \"mc1\": 0.3488372093023256,\n\
\ \"mc1_stderr\": 0.01668441985998689,\n \"mc2\": 0.5080876092024247,\n\
\ \"mc2_stderr\": 0.015136250982780014\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49146757679180886,\n \"acc_stderr\": 0.01460926316563219,\n\
\ \"acc_norm\": 0.523037542662116,\n \"acc_norm_stderr\": 0.014595873205358267\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5572595100577574,\n\
\ \"acc_stderr\": 0.004956953917781314,\n \"acc_norm\": 0.7509460266879108,\n\
\ \"acc_norm_stderr\": 0.004315812968431585\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.0404633688397825,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.0404633688397825\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3544973544973545,\n \"acc_stderr\": 0.024636830602841997,\n \"\
acc_norm\": 0.3544973544973545,\n \"acc_norm_stderr\": 0.024636830602841997\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.02737987122994325,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.02737987122994325\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860677,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860677\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.542016806722689,\n \"acc_stderr\": 0.03236361111951941,\n \
\ \"acc_norm\": 0.542016806722689,\n \"acc_norm_stderr\": 0.03236361111951941\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7596330275229358,\n \"acc_stderr\": 0.01832060732096407,\n \"\
acc_norm\": 0.7596330275229358,\n \"acc_norm_stderr\": 0.01832060732096407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"\
acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7058823529411765,\n \"acc_stderr\": 0.03198001660115071,\n \"\
acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.03198001660115071\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516301,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516301\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.015464676163395948,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.015464676163395948\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.026483392042098177,\n\
\ \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.026483392042098177\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.01426555419233114,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.01426555419233114\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.027264297599804012,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.027264297599804012\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.0272725828498398,\n\
\ \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.0272725828498398\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.02931601177634356,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.02931601177634356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n\
\ \"acc_stderr\": 0.012633353557534425,\n \"acc_norm\": 0.42698826597131684,\n\
\ \"acc_norm_stderr\": 0.012633353557534425\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324224,\n \
\ \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324224\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5918367346938775,\n \"acc_stderr\": 0.03146465712827424,\n\
\ \"acc_norm\": 0.5918367346938775,\n \"acc_norm_stderr\": 0.03146465712827424\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.032200241045342054,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.032200241045342054\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3488372093023256,\n\
\ \"mc1_stderr\": 0.01668441985998689,\n \"mc2\": 0.5080876092024247,\n\
\ \"mc2_stderr\": 0.015136250982780014\n }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v13
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|arc:challenge|25_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hellaswag|10_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-32-08.394718.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-32-08.394718.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T08-32-08.394718.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T08-32-08.394718.parquet'
- config_name: results
data_files:
- split: 2023_10_10T08_32_08.394718
path:
- results_2023-10-10T08-32-08.394718.parquet
- split: latest
path:
- results_2023-10-10T08-32-08.394718.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mistral-7b-v13
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v13
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-mistral-7b-v13](https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v13) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-10T08:32:08.394718](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13/blob/main/results_2023-10-10T08-32-08.394718.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5620369791952703,
"acc_stderr": 0.03447878252384387,
"acc_norm": 0.5658548856783289,
"acc_norm_stderr": 0.03446768877961296,
"mc1": 0.3488372093023256,
"mc1_stderr": 0.01668441985998689,
"mc2": 0.5080876092024247,
"mc2_stderr": 0.015136250982780014
},
"harness|arc:challenge|25": {
"acc": 0.49146757679180886,
"acc_stderr": 0.01460926316563219,
"acc_norm": 0.523037542662116,
"acc_norm_stderr": 0.014595873205358267
},
"harness|hellaswag|10": {
"acc": 0.5572595100577574,
"acc_stderr": 0.004956953917781314,
"acc_norm": 0.7509460266879108,
"acc_norm_stderr": 0.004315812968431585
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.0404633688397825,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.0404633688397825
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3544973544973545,
"acc_stderr": 0.024636830602841997,
"acc_norm": 0.3544973544973545,
"acc_norm_stderr": 0.024636830602841997
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.02737987122994325,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.02737987122994325
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860677,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860677
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5102564102564102,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.5102564102564102,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.542016806722689,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.542016806722689,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7596330275229358,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.7596330275229358,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.03198001660115071,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.03198001660115071
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516301,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516301
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395948,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395948
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.026483392042098177,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.026483392042098177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.01426555419233114,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.01426555419233114
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.027264297599804012,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.027264297599804012
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5987654320987654,
"acc_stderr": 0.0272725828498398,
"acc_norm": 0.5987654320987654,
"acc_norm_stderr": 0.0272725828498398
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.02931601177634356,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.02931601177634356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42698826597131684,
"acc_stderr": 0.012633353557534425,
"acc_norm": 0.42698826597131684,
"acc_norm_stderr": 0.012633353557534425
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324224,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324224
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5918367346938775,
"acc_stderr": 0.03146465712827424,
"acc_norm": 0.5918367346938775,
"acc_norm_stderr": 0.03146465712827424
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.032200241045342054,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.032200241045342054
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3488372093023256,
"mc1_stderr": 0.01668441985998689,
"mc2": 0.5080876092024247,
"mc2_stderr": 0.015136250982780014
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
phanvancongthanh/pubchem_kinases | 2023-10-10T08:36:56.000Z | [
"region:us"
] | phanvancongthanh | null | null | null | 0 | 0 | Entry not found |
danieletdg/eCommerceQuery | 2023-10-10T08:38:25.000Z | [
"region:us"
] | danieletdg | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: entities
dtype: string
splits:
- name: train
num_bytes: 398956
num_examples: 3994
- name: test
num_bytes: 1597728
num_examples: 15980
download_size: 1007526
dataset_size: 1996684
---
# Dataset Card for "eCommerceQuery"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aloobun/mini-math23k-v1 | 2023-10-10T12:40:42.000Z | [
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:en",
"license:mit",
"region:us"
] | aloobun | null | null | null | 0 | 0 | ---
license: mit
language:
- en
size_categories:
- 10K<n<100K
task_categories:
- text-generation
pretty_name: math
---
The mini-math23k-v1 dataset is composed of ~23,000 entries drawn from open datasets across the AI landscape, including:
- [TIGER-Lab/MathInstruct](https://huggingface.co/datasets/TIGER-Lab/MathInstruct)
- [Birchlabs/openai-prm800k-solutions-only](https://huggingface.co/datasets/Birchlabs/openai-prm800k-solutions-only)
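A minimal loading sketch (the split name and column layout are assumptions; inspect the features before building a training pipeline):
```python
from datasets import load_dataset

# A minimal sketch; the split name "train" and the column layout are
# assumptions, so inspect the features before building a training pipeline.
ds = load_dataset("aloobun/mini-math23k-v1", split="train")
print(ds.features)
print(ds[0])
```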
Credits:
```
Birchlabs
```
```
@article{yue2023mammoth,
title={MAmmoTH: Building Math Generalist Models through Hybrid Instruction Tuning},
author={Xiang Yue and Xingwei Qu and Ge Zhang and Yao Fu and Wenhao Huang and Huan Sun and Yu Su and Wenhu Chen},
journal={arXiv preprint arXiv:2309.05653},
year={2023}
}
``` |
Shiveswarran/llm_instruction_code_v6 | 2023-10-10T10:09:38.000Z | [
"license:mit",
"region:us"
] | Shiveswarran | null | null | null | 0 | 0 | ---
license: mit
---
|
Shiveswarran/llm_instruction_code_v7 | 2023-10-10T09:03:40.000Z | [
"license:apache-2.0",
"region:us"
] | Shiveswarran | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
SparkExpedition/TicketClassificationData | 2023-10-10T08:59:01.000Z | [
"license:mit",
"region:us"
] | SparkExpedition | null | null | null | 0 | 0 | ---
license: mit
---
|
philipphager/baidu-ultr-590k | 2023-10-10T13:45:09.000Z | [
"license:cc-by-nc-4.0",
"region:us"
] | philipphager | Query-document vectors and clicks for the Baidu Unbiased Learning to Rank dataset used
at the WSDM23 cup. This dataset uses the winning BERT cross-encoder from Tencent
to compute query-document vectors (768 dims), mainly for ease of use and to enable
the use of simpler, smaller neural networks that are more common in ULTR research.
This dataset contains features for part-00000.gz of the Baidu dataset,
containing 589,824 queries and 6,271,536 documents. | @InProceedings{huggingface:dataset,
title = {baidu-ultr-590k},
author={Philipp Hager},
year={2023}
} | null | 0 | 0 | ---
license: cc-by-nc-4.0
---
|
chanelcolgate/image-classification-yenthienviet | 2023-10-10T09:46:11.000Z | [
"license:mit",
"region:us"
] | chanelcolgate | This dataset contains all THIENVIET product images, split into training,
validation and testing sets | null | null | 0 | 0 | ---
license: mit
---
|
c0m/123 | 2023-10-10T09:19:40.000Z | [
"region:us"
] | c0m | null | null | null | 0 | 0 | Entry not found |
shuttie/dadjokes | 2023-10-10T09:40:50.000Z | [
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"region:us"
] | shuttie | null | null | null | 0 | 0 | ---
license: apache-2.0
language:
- en
size_categories:
- 10K<n<100K
---
# Dad Jokes dataset
This dataset is generated from the [Kaggle Reddit Dad Jokes](https://www.kaggle.com/datasets/oktayozturk010/reddit-dad-jokes) by [Oktay Ozturk](https://www.kaggle.com/oktayozturk010), with the following modifications:
* Only jokes with 5+ votes were sampled. Less upvoted jokes are too cringe.
* With a set of heuristics, each joke was split into two parts: the base (setup) and the punchline.
## Format
The dataset is formatted as a CSV, and is split into train/test parts:
* train: 52000 samples
* test: 1400 samples
```csv
"question","response"
"I asked my priest how he gets holy water","He said it’s just regular water, he just boils the hell out of it"
"Life Hack: If you play My Chemical Romance loud enough in your yard","your grass will cut itself"
"Why did Mr. Potato Head get pulled over","He was baked"
"How did the Mexican John Wick taste his Burrito","He took Juan Lick"
```
## Usage
With a base/punchline split, this dataset can be used for a joke prediction task with any LLM.
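For example, a minimal sketch of turning the base/punchline pairs into prompt/completion pairs (it assumes the Hub's automatic CSV loader resolves the train/test files in this repository):
```python
from datasets import load_dataset

# A minimal sketch: it assumes the Hub's automatic CSV loader picks up
# the train/test files in this repository.
jokes = load_dataset("shuttie/dadjokes")

# Each row pairs a setup ("question") with a punchline ("response"),
# which maps directly onto prompt/completion pairs for an LLM.
sample = jokes["train"][0]
prompt = f"Finish this dad joke: {sample['question']}\n"
print(prompt + sample["response"])
```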
## License
Apache 2.0. |
ShreeyaVenneti/1SENTENCE_APPENDED_AVG_SELFPROM | 2023-10-10T09:28:14.000Z | [
"region:us"
] | ShreeyaVenneti | null | null | null | 0 | 0 | Entry not found |
temasarkisov/EsportLogosV2_processed_V2 | 2023-10-10T09:28:56.000Z | [
"region:us"
] | temasarkisov | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 4561815.0
num_examples: 73
download_size: 4560462
dataset_size: 4561815.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "EsportLogosV2_processed_V2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hzaatiti/horouf | 2023-10-10T09:29:23.000Z | [
"license:apache-2.0",
"region:us"
] | hzaatiti | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
dkabx/ai_info | 2023-10-10T09:31:04.000Z | [
"license:apache-2.0",
"region:us"
] | dkabx | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1 | 2023-10-10T09:31:59.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T09:30:33.515075](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1/blob/main/results_2023-10-10T09-30-33.515075.json)\
\ (note that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.556748563406145,\n\
\ \"acc_stderr\": 0.03445289163708764,\n \"acc_norm\": 0.5608341055809688,\n\
\ \"acc_norm_stderr\": 0.034433797078382726,\n \"mc1\": 0.2692778457772338,\n\
\ \"mc1_stderr\": 0.015528566637087286,\n \"mc2\": 0.3814609933945348,\n\
\ \"mc2_stderr\": 0.013919121072172235\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5255972696245734,\n \"acc_stderr\": 0.014592230885298962,\n\
\ \"acc_norm\": 0.5580204778156996,\n \"acc_norm_stderr\": 0.014512682523128342\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6141206930890261,\n\
\ \"acc_stderr\": 0.004858074013443995,\n \"acc_norm\": 0.8227444732125074,\n\
\ \"acc_norm_stderr\": 0.0038110434120246597\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490437,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490437\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400352,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400352\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.024594975128920935,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.024594975128920935\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n\
\ \"acc_stderr\": 0.027327548447957536,\n \"acc_norm\": 0.6387096774193548,\n\
\ \"acc_norm_stderr\": 0.027327548447957536\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147602,\n\
\ \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147602\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5230769230769231,\n \"acc_stderr\": 0.025323990861736236,\n\
\ \"acc_norm\": 0.5230769230769231,\n \"acc_norm_stderr\": 0.025323990861736236\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871923,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871923\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n\
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7614678899082569,\n \"acc_stderr\": 0.01827257581023187,\n \"\
acc_norm\": 0.7614678899082569,\n \"acc_norm_stderr\": 0.01827257581023187\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604257,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604257\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114968,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114968\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262971,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262971\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.02645350805404033,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.02645350805404033\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7675606641123882,\n\
\ \"acc_stderr\": 0.015104550008905713,\n \"acc_norm\": 0.7675606641123882,\n\
\ \"acc_norm_stderr\": 0.015104550008905713\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.025862201852277902,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.025862201852277902\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47039106145251397,\n\
\ \"acc_stderr\": 0.016693154927383557,\n \"acc_norm\": 0.47039106145251397,\n\
\ \"acc_norm_stderr\": 0.016693154927383557\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824093,\n\
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824093\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.02736807824397163,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.02736807824397163\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.424380704041721,\n\
\ \"acc_stderr\": 0.012623343757430018,\n \"acc_norm\": 0.424380704041721,\n\
\ \"acc_norm_stderr\": 0.012623343757430018\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.030211479609121596,\n\
\ \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.030211479609121596\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5735294117647058,\n \"acc_stderr\": 0.020007912739359375,\n \
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.020007912739359375\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5959183673469388,\n \"acc_stderr\": 0.031414708025865885,\n\
\ \"acc_norm\": 0.5959183673469388,\n \"acc_norm_stderr\": 0.031414708025865885\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.03218093795602357,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.03218093795602357\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2692778457772338,\n\
\ \"mc1_stderr\": 0.015528566637087286,\n \"mc2\": 0.3814609933945348,\n\
\ \"mc2_stderr\": 0.013919121072172235\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|arc:challenge|25_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hellaswag|10_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-30-33.515075.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-30-33.515075.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T09-30-33.515075.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T09-30-33.515075.parquet'
- config_name: results
data_files:
- split: 2023_10_10T09_30_33.515075
path:
- results_2023-10-10T09-30-33.515075.parquet
- split: latest
path:
- results_2023-10-10T09-30-33.515075.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1",
"harness_truthfulqa_mc_0",
split="train")
```
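Each per-task configuration can be loaded the same way. As an illustrative sketch (the config and split names below are taken from the YAML listing above, not from any official example), here is how to pull the `latest` split of a single MMLU sub-task:
```python
from datasets import load_dataset

# Illustrative sketch: load the "latest" split of one per-task config.
# The config name follows the harness_hendrycksTest_<subject>_5 pattern
# listed in the YAML frontmatter above.
details = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1",
    "harness_hendrycksTest_high_school_macroeconomics_5",
    split="latest",
)
print(details)
```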
## Latest results
These are the [latest results from run 2023-10-10T09:30:33.515075](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1/blob/main/results_2023-10-10T09-30-33.515075.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.556748563406145,
"acc_stderr": 0.03445289163708764,
"acc_norm": 0.5608341055809688,
"acc_norm_stderr": 0.034433797078382726,
"mc1": 0.2692778457772338,
"mc1_stderr": 0.015528566637087286,
"mc2": 0.3814609933945348,
"mc2_stderr": 0.013919121072172235
},
"harness|arc:challenge|25": {
"acc": 0.5255972696245734,
"acc_stderr": 0.014592230885298962,
"acc_norm": 0.5580204778156996,
"acc_norm_stderr": 0.014512682523128342
},
"harness|hellaswag|10": {
"acc": 0.6141206930890261,
"acc_stderr": 0.004858074013443995,
"acc_norm": 0.8227444732125074,
"acc_norm_stderr": 0.0038110434120246597
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490437,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490437
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400352,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400352
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.024594975128920935,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.024594975128920935
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.027327548447957536,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.027327548447957536
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.03051611137147602,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.03051611137147602
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5230769230769231,
"acc_stderr": 0.025323990861736236,
"acc_norm": 0.5230769230769231,
"acc_norm_stderr": 0.025323990861736236
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871923,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871923
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.032016501007396114,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.032016501007396114
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7614678899082569,
"acc_stderr": 0.01827257581023187,
"acc_norm": 0.7614678899082569,
"acc_norm_stderr": 0.01827257581023187
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604257,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114968,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114968
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262971,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262971
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.02645350805404033,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.02645350805404033
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7675606641123882,
"acc_stderr": 0.015104550008905713,
"acc_norm": 0.7675606641123882,
"acc_norm_stderr": 0.015104550008905713
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.025862201852277902,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.025862201852277902
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47039106145251397,
"acc_stderr": 0.016693154927383557,
"acc_norm": 0.47039106145251397,
"acc_norm_stderr": 0.016693154927383557
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.028213504177824093,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.028213504177824093
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.02736807824397163,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.02736807824397163
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.424380704041721,
"acc_stderr": 0.012623343757430018,
"acc_norm": 0.424380704041721,
"acc_norm_stderr": 0.012623343757430018
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.030211479609121596,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.030211479609121596
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.020007912739359375,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.020007912739359375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5959183673469388,
"acc_stderr": 0.031414708025865885,
"acc_norm": 0.5959183673469388,
"acc_norm_stderr": 0.031414708025865885
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919797,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919797
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.03218093795602357,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.03218093795602357
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2692778457772338,
"mc1_stderr": 0.015528566637087286,
"mc2": 0.3814609933945348,
"mc2_stderr": 0.013919121072172235
}
}
```
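As a quick sanity check, the macro-averaged MMLU accuracy can be recomputed from the per-task entries above. The following is a minimal sketch assuming the JSON object shown above has been saved locally as `results.json` (a hypothetical path):
```python
import json

# Minimal sketch: recompute the macro-averaged accuracy over the
# hendrycksTest (MMLU) sub-tasks from the results JSON shown above.
with open("results.json") as f:  # hypothetical local copy of the results
    results = json.load(f)

mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
macro_acc = sum(task["acc"] for task in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU sub-tasks, macro-averaged acc = {macro_acc:.4f}")
```
Note that the top-level "all" entry averages over every harness task (ARC and HellaSwag included), so it will differ slightly from the MMLU-only figure.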
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hanifabdlh/quac-lamini-instruction-indo-90k-100k | 2023-10-10T09:33:35.000Z | [
"region:us"
] | hanifabdlh | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: context
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
- name: instruction_source
dtype: string
splits:
- name: train
num_bytes: 3818546
num_examples: 10000
download_size: 2124213
dataset_size: 3818546
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "quac-lamini-instruction-indo-90k-100k"
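As a minimal illustrative sketch (the field names come from the `dataset_info` block above), the dataset can be loaded and inspected like this:
```python
from datasets import load_dataset

# Minimal sketch: load the train split and inspect one example.
# Fields (context, instruction, response, instruction_source) are
# taken from the dataset_info block above.
ds = load_dataset("hanifabdlh/quac-lamini-instruction-indo-90k-100k", split="train")
print(ds[0]["instruction"])
print(ds[0]["response"])
```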
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v1.0 | 2023-10-10T09:37:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of uukuguy/speechless-code-mistral-7b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-code-mistral-7b-v1.0](https://huggingface.co/uukuguy/speechless-code-mistral-7b-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v1.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T09:35:40.611521](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v1.0/blob/main/results_2023-10-10T09-35-40.611521.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6290829151650942,\n\
\ \"acc_stderr\": 0.033174869303662674,\n \"acc_norm\": 0.6329582077451893,\n\
\ \"acc_norm_stderr\": 0.03315323833095972,\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.0164667696136983,\n \"mc2\": 0.4789607258136594,\n\
\ \"mc2_stderr\": 0.014858060050825114\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5742320819112628,\n \"acc_stderr\": 0.014449464278868805,\n\
\ \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.014280522667467321\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6404102768372834,\n\
\ \"acc_stderr\": 0.004788994060654275,\n \"acc_norm\": 0.8374825731925911,\n\
\ \"acc_norm_stderr\": 0.0036817082825814575\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880274,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880274\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.032555253593403555,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.032555253593403555\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n\
\ \"acc_stderr\": 0.02436259969303108,\n \"acc_norm\": 0.7580645161290323,\n\
\ \"acc_norm_stderr\": 0.02436259969303108\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6435897435897436,\n \"acc_stderr\": 0.0242831405294673,\n \
\ \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.0242831405294673\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848036,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848036\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.02955429260569507,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.02955429260569507\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069432,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069432\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.02390232554956041,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.02390232554956041\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\
\ \"acc_stderr\": 0.013853724170922531,\n \"acc_norm\": 0.8160919540229885,\n\
\ \"acc_norm_stderr\": 0.013853724170922531\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n\
\ \"acc_stderr\": 0.015748421208187303,\n \"acc_norm\": 0.3318435754189944,\n\
\ \"acc_norm_stderr\": 0.015748421208187303\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984806,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984806\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4954367666232073,\n\
\ \"acc_stderr\": 0.012769704263117528,\n \"acc_norm\": 0.4954367666232073,\n\
\ \"acc_norm_stderr\": 0.012769704263117528\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039656,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039656\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.0164667696136983,\n \"mc2\": 0.4789607258136594,\n\
\ \"mc2_stderr\": 0.014858060050825114\n }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-code-mistral-7b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|arc:challenge|25_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hellaswag|10_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-35-40.611521.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-35-40.611521.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T09-35-40.611521.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T09-35-40.611521.parquet'
- config_name: results
data_files:
- split: 2023_10_10T09_35_40.611521
path:
- results_2023-10-10T09-35-40.611521.parquet
- split: latest
path:
- results_2023-10-10T09-35-40.611521.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-code-mistral-7b-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-code-mistral-7b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-code-mistral-7b-v1.0](https://huggingface.co/uukuguy/speechless-code-mistral-7b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v1.0",
"harness_truthfulqa_mc_0",
split="train")
```
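Beyond the "train" split, you can also pin a specific run or pull the aggregated numbers. A minimal sketch, assuming the config and split names listed in the YAML front matter above:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v1.0"

# Aggregated metrics of the most recent run (the "results" config).
aggregated = load_dataset(REPO, "results", split="latest")

# Per-example details of one task, pinned to the timestamped split of a run.
truthfulqa = load_dataset(
    REPO,
    "harness_truthfulqa_mc_0",
    split="2023_10_10T09_35_40.611521",
)
```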
## Latest results
These are the [latest results from run 2023-10-10T09:35:40.611521](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v1.0/blob/main/results_2023-10-10T09-35-40.611521.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6290829151650942,
"acc_stderr": 0.033174869303662674,
"acc_norm": 0.6329582077451893,
"acc_norm_stderr": 0.03315323833095972,
"mc1": 0.33047735618115054,
"mc1_stderr": 0.0164667696136983,
"mc2": 0.4789607258136594,
"mc2_stderr": 0.014858060050825114
},
"harness|arc:challenge|25": {
"acc": 0.5742320819112628,
"acc_stderr": 0.014449464278868805,
"acc_norm": 0.60580204778157,
"acc_norm_stderr": 0.014280522667467321
},
"harness|hellaswag|10": {
"acc": 0.6404102768372834,
"acc_stderr": 0.004788994060654275,
"acc_norm": 0.8374825731925911,
"acc_norm_stderr": 0.0036817082825814575
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.042849586397534015,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.042849586397534015
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880274,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.02436259969303108,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.02436259969303108
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.0242831405294673,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.0242831405294673
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848036,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848036
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.02955429260569507,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.02955429260569507
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069432,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069432
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.02390232554956041,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.02390232554956041
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922531,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922531
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187303,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187303
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984806,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984806
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4954367666232073,
"acc_stderr": 0.012769704263117528,
"acc_norm": 0.4954367666232073,
"acc_norm_stderr": 0.012769704263117528
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.02850145286039656,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.02850145286039656
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33047735618115054,
"mc1_stderr": 0.0164667696136983,
"mc2": 0.4789607258136594,
"mc2_stderr": 0.014858060050825114
}
}
```
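To relate the per-task numbers to the aggregate, you can average them yourself. A minimal sketch, assuming the JSON block above has been saved to a local file (the path and variable names are hypothetical):
```python
import json

# Hypothetical local copy of the JSON results shown above.
with open("latest_results.json") as f:
    results = json.load(f)

# Mean acc_norm over the hendrycksTest (MMLU) tasks only; the "all" block
# averages over every task, including arc:challenge and hellaswag, so the
# two figures are close but not identical.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest")}
mean_acc_norm = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU tasks, mean acc_norm = {mean_acc_norm:.4f}")
```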
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down | 2023-10-10T09:37:17.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T09:35:55.043179](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down/blob/main/results_2023-10-10T09-35-55.043179.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5592913999207071,\n\
\ \"acc_stderr\": 0.034408410267323906,\n \"acc_norm\": 0.5638703288864542,\n\
\ \"acc_norm_stderr\": 0.03438731035716268,\n \"mc1\": 0.2741738066095471,\n\
\ \"mc1_stderr\": 0.01561651849721938,\n \"mc2\": 0.4026382452319607,\n\
\ \"mc2_stderr\": 0.014141767221043162\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5238907849829352,\n \"acc_stderr\": 0.014594701798071654,\n\
\ \"acc_norm\": 0.5853242320819113,\n \"acc_norm_stderr\": 0.014397070564409174\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6140211113324039,\n\
\ \"acc_stderr\": 0.004858306877874625,\n \"acc_norm\": 0.8227444732125074,\n\
\ \"acc_norm_stderr\": 0.0038110434120246627\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526066,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526066\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n \
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.04062990784146667,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.04062990784146667\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537314,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537314\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.02455229220934265,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.02455229220934265\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n\
\ \"acc_stderr\": 0.027575960723278243,\n \"acc_norm\": 0.6225806451612903,\n\
\ \"acc_norm_stderr\": 0.027575960723278243\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486519,\n\
\ \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486519\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.0356796977226805,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.0356796977226805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365907,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365907\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.541025641025641,\n \"acc_stderr\": 0.025265525491284295,\n \
\ \"acc_norm\": 0.541025641025641,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n\
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7614678899082569,\n \"acc_stderr\": 0.018272575810231874,\n \"\
acc_norm\": 0.7614678899082569,\n \"acc_norm_stderr\": 0.018272575810231874\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302873,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302873\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.04284467968052194,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.04284467968052194\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.02645350805404033,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.02645350805404033\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7675606641123882,\n\
\ \"acc_stderr\": 0.015104550008905709,\n \"acc_norm\": 0.7675606641123882,\n\
\ \"acc_norm_stderr\": 0.015104550008905709\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.025862201852277895,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.025862201852277895\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29497206703910617,\n\
\ \"acc_stderr\": 0.015251931579208176,\n \"acc_norm\": 0.29497206703910617,\n\
\ \"acc_norm_stderr\": 0.015251931579208176\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.027870745278290282,\n\
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.027870745278290282\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6270096463022508,\n\
\ \"acc_stderr\": 0.0274666102131401,\n \"acc_norm\": 0.6270096463022508,\n\
\ \"acc_norm_stderr\": 0.0274666102131401\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.02935491115994099,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.02935491115994099\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4282920469361147,\n\
\ \"acc_stderr\": 0.012638223880313165,\n \"acc_norm\": 0.4282920469361147,\n\
\ \"acc_norm_stderr\": 0.012638223880313165\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329383,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329383\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5702614379084967,\n \"acc_stderr\": 0.020027122784928547,\n \
\ \"acc_norm\": 0.5702614379084967,\n \"acc_norm_stderr\": 0.020027122784928547\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5795918367346938,\n \"acc_stderr\": 0.03160106993449601,\n\
\ \"acc_norm\": 0.5795918367346938,\n \"acc_norm_stderr\": 0.03160106993449601\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\
\ \"acc_stderr\": 0.03251006816458619,\n \"acc_norm\": 0.6965174129353234,\n\
\ \"acc_norm_stderr\": 0.03251006816458619\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2741738066095471,\n\
\ \"mc1_stderr\": 0.01561651849721938,\n \"mc2\": 0.4026382452319607,\n\
\ \"mc2_stderr\": 0.014141767221043162\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|arc:challenge|25_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hellaswag|10_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-35-55.043179.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-35-55.043179.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T09-35-55.043179.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T09-35-55.043179.parquet'
- config_name: results
data_files:
- split: 2023_10_10T09_35_55.043179
path:
- results_2023-10-10T09-35-55.043179.parquet
- split: latest
path:
- results_2023-10-10T09-35-55.043179.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down",
"harness_truthfulqa_mc_0",
split="train")
```
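As a further sketch under the same assumptions (the `datasets` library and the split layout declared in the YAML header above), you can also enumerate the available configurations or target the `latest` split directly:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down"

# Enumerate the 61 per-task configurations declared in the YAML header above.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:3])

# The "latest" split of each configuration resolves to the most recent run's files.
latest = load_dataset(repo, "harness_truthfulqa_mc_0", split="latest")
print(latest)
```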
## Latest results
These are the [latest results from run 2023-10-10T09:35:55.043179](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down/blob/main/results_2023-10-10T09-35-55.043179.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5592913999207071,
"acc_stderr": 0.034408410267323906,
"acc_norm": 0.5638703288864542,
"acc_norm_stderr": 0.03438731035716268,
"mc1": 0.2741738066095471,
"mc1_stderr": 0.01561651849721938,
"mc2": 0.4026382452319607,
"mc2_stderr": 0.014141767221043162
},
"harness|arc:challenge|25": {
"acc": 0.5238907849829352,
"acc_stderr": 0.014594701798071654,
"acc_norm": 0.5853242320819113,
"acc_norm_stderr": 0.014397070564409174
},
"harness|hellaswag|10": {
"acc": 0.6140211113324039,
"acc_stderr": 0.004858306877874625,
"acc_norm": 0.8227444732125074,
"acc_norm_stderr": 0.0038110434120246627
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.04062990784146667,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.04062990784146667
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537314,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537314
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.02455229220934265,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.02455229220934265
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.027575960723278243,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.027575960723278243
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486519,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486519
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.0356796977226805,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.0356796977226805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365907,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365907
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.541025641025641,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.541025641025641,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7614678899082569,
"acc_stderr": 0.018272575810231874,
"acc_norm": 0.7614678899082569,
"acc_norm_stderr": 0.018272575810231874
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302873,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302873
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.04284467968052194,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.04284467968052194
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.02645350805404033,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.02645350805404033
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7675606641123882,
"acc_stderr": 0.015104550008905709,
"acc_norm": 0.7675606641123882,
"acc_norm_stderr": 0.015104550008905709
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.025862201852277895,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.025862201852277895
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29497206703910617,
"acc_stderr": 0.015251931579208176,
"acc_norm": 0.29497206703910617,
"acc_norm_stderr": 0.015251931579208176
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.027870745278290282,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.027870745278290282
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6270096463022508,
"acc_stderr": 0.0274666102131401,
"acc_norm": 0.6270096463022508,
"acc_norm_stderr": 0.0274666102131401
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.02935491115994099,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.02935491115994099
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4282920469361147,
"acc_stderr": 0.012638223880313165,
"acc_norm": 0.4282920469361147,
"acc_norm_stderr": 0.012638223880313165
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.030187532060329383,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.030187532060329383
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5702614379084967,
"acc_stderr": 0.020027122784928547,
"acc_norm": 0.5702614379084967,
"acc_norm_stderr": 0.020027122784928547
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5795918367346938,
"acc_stderr": 0.03160106993449601,
"acc_norm": 0.5795918367346938,
"acc_norm_stderr": 0.03160106993449601
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6965174129353234,
"acc_stderr": 0.03251006816458619,
"acc_norm": 0.6965174129353234,
"acc_norm_stderr": 0.03251006816458619
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2741738066095471,
"mc1_stderr": 0.01561651849721938,
"mc2": 0.4026382452319607,
"mc2_stderr": 0.014141767221043162
}
}
```
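To pull the aggregated file itself rather than the per-task parquet splits, a minimal sketch (assuming `huggingface_hub` is installed; the filename comes from the results link above) is:
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated-results JSON from the dataset repo (not the model repo).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down",
    filename="results_2023-10-10T09-35-55.043179.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# The exact top-level layout may differ from the excerpt above; inspect the keys first.
print(list(results.keys()))
```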
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-spaced-repetition/fsrs-dataset | 2023-10-10T11:42:28.000Z | [
"license:mit",
"region:us"
] | open-spaced-repetition | null | null | null | 0 | 0 | ---
license: mit
---
|
Barry30/lishiqing | 2023-10-10T09:40:24.000Z | [
"license:apache-2.0",
"region:us"
] | Barry30 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down | 2023-10-10T09:44:07.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T09:42:44.126959](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down/blob/main/results_2023-10-10T09-42-44.126959.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5617778768156815,\n\
\ \"acc_stderr\": 0.03445885114240818,\n \"acc_norm\": 0.5662582205912376,\n\
\ \"acc_norm_stderr\": 0.03443794881009665,\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253595,\n \"mc2\": 0.39511256126038685,\n\
\ \"mc2_stderr\": 0.014250958645489436\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5273037542662116,\n \"acc_stderr\": 0.014589589101985996,\n\
\ \"acc_norm\": 0.5836177474402731,\n \"acc_norm_stderr\": 0.014405618279436176\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6153156741684923,\n\
\ \"acc_stderr\": 0.004855262903270803,\n \"acc_norm\": 0.8233419637522406,\n\
\ \"acc_norm_stderr\": 0.0038059961194403767\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490437,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490437\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.038073017265045125,\n\
\ \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.038073017265045125\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04690650298201943,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04690650298201943\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.46808510638297873,\n\
\ \"acc_stderr\": 0.03261936918467382,\n \"acc_norm\": 0.46808510638297873,\n\
\ \"acc_norm_stderr\": 0.03261936918467382\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n\
\ \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"\
acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.02441923496681907,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.02441923496681907\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.026860206444724345,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.026860206444724345\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562427,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562427\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.032586303838365555,\n \"\
acc_norm\": 0.702020202020202,\n \"acc_norm_stderr\": 0.032586303838365555\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624527,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624527\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.02534267129380725,\n \
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.02534267129380725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465076,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465076\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.031918633744784645,\n \
\ \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.031918633744784645\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7596330275229358,\n \"acc_stderr\": 0.01832060732096407,\n \"\
acc_norm\": 0.7596330275229358,\n \"acc_norm_stderr\": 0.01832060732096407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896079,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896079\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.02955429260569506,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.02955429260569506\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.0384985609879409,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.0384985609879409\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7713920817369093,\n\
\ \"acc_stderr\": 0.01501688469853988,\n \"acc_norm\": 0.7713920817369093,\n\
\ \"acc_norm_stderr\": 0.01501688469853988\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.02603389061357628,\n\
\ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.02603389061357628\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3787709497206704,\n\
\ \"acc_stderr\": 0.016223533510365117,\n \"acc_norm\": 0.3787709497206704,\n\
\ \"acc_norm_stderr\": 0.016223533510365117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023334,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192714,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192714\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100793,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100793\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.02968010556502904,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.02968010556502904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4348109517601043,\n\
\ \"acc_stderr\": 0.012661233805616307,\n \"acc_norm\": 0.4348109517601043,\n\
\ \"acc_norm_stderr\": 0.012661233805616307\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5702614379084967,\n \"acc_stderr\": 0.020027122784928544,\n \
\ \"acc_norm\": 0.5702614379084967,\n \"acc_norm_stderr\": 0.020027122784928544\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5673469387755102,\n \"acc_stderr\": 0.031717528240626645,\n\
\ \"acc_norm\": 0.5673469387755102,\n \"acc_norm_stderr\": 0.031717528240626645\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.03115715086935558,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.03115715086935558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932264,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932264\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253595,\n \"mc2\": 0.39511256126038685,\n\
\ \"mc2_stderr\": 0.014250958645489436\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|arc:challenge|25_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hellaswag|10_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-42-44.126959.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-42-44.126959.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T09-42-44.126959.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T09-42-44.126959.parquet'
- config_name: results
data_files:
- split: 2023_10_10T09_42_44.126959
path:
- results_2023-10-10T09-42-44.126959.parquet
- split: latest
path:
- results_2023-10-10T09-42-44.126959.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down",
"harness_truthfulqa_mc_0",
split="train")
```
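The "results" configuration mentioned above can be loaded the same way. A minimal sketch (the config and split names below are taken directly from this card's YAML listing; everything else is standard `datasets` usage):
```python
from datasets import load_dataset

# The "results" config aggregates the whole run; its "latest" split
# always points at the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down",
    "results",
    split="latest",
)
print(results)
```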
## Latest results
These are the [latest results from run 2023-10-10T09:42:44.126959](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down/blob/main/results_2023-10-10T09-42-44.126959.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5617778768156815,
"acc_stderr": 0.03445885114240818,
"acc_norm": 0.5662582205912376,
"acc_norm_stderr": 0.03443794881009665,
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253595,
"mc2": 0.39511256126038685,
"mc2_stderr": 0.014250958645489436
},
"harness|arc:challenge|25": {
"acc": 0.5273037542662116,
"acc_stderr": 0.014589589101985996,
"acc_norm": 0.5836177474402731,
"acc_norm_stderr": 0.014405618279436176
},
"harness|hellaswag|10": {
"acc": 0.6153156741684923,
"acc_stderr": 0.004855262903270803,
"acc_norm": 0.8233419637522406,
"acc_norm_stderr": 0.0038059961194403767
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490437,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490437
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.02441923496681907,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.02441923496681907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724345,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724345
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.032586303838365555,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.032586303838365555
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624527,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624527
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.02534267129380725,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.02534267129380725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465076,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465076
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.031918633744784645,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.031918633744784645
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7596330275229358,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.7596330275229358,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896079,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896079
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.02955429260569506,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.02955429260569506
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.0384985609879409,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.0384985609879409
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7713920817369093,
"acc_stderr": 0.01501688469853988,
"acc_norm": 0.7713920817369093,
"acc_norm_stderr": 0.01501688469853988
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.02603389061357628,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.02603389061357628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3787709497206704,
"acc_stderr": 0.016223533510365117,
"acc_norm": 0.3787709497206704,
"acc_norm_stderr": 0.016223533510365117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.027780141207023334,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.027780141207023334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192714,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192714
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100793,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100793
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4348109517601043,
"acc_stderr": 0.012661233805616307,
"acc_norm": 0.4348109517601043,
"acc_norm_stderr": 0.012661233805616307
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.03029061918048569,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.03029061918048569
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5702614379084967,
"acc_stderr": 0.020027122784928544,
"acc_norm": 0.5702614379084967,
"acc_norm_stderr": 0.020027122784928544
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5673469387755102,
"acc_stderr": 0.031717528240626645,
"acc_norm": 0.5673469387755102,
"acc_norm_stderr": 0.031717528240626645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935558,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935558
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932264,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932264
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253595,
"mc2": 0.39511256126038685,
"mc2_stderr": 0.014250958645489436
}
}
```
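As a rough illustration of how the per-task entries above can be consumed programmatically, the sketch below recomputes a mean accuracy over a trimmed copy of this JSON. Only three sub-task entries are copied verbatim, so the printed mean covers just those three; it is not the official leaderboard aggregation:
```python
# Trimmed copy of the results JSON above (values copied verbatim);
# a full computation would iterate over all "hendrycksTest" entries.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5037037037037037},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8011695906432749},
}

mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"mean acc over {len(mmlu_accs)} sub-tasks: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```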
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
rbel/companynames | 2023-10-10T13:05:44.000Z | [
"license:apache-2.0",
"region:us"
] | rbel | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Jagadeesh-ti/hr_v2 | 2023-10-10T09:48:49.000Z | [
"region:us"
] | Jagadeesh-ti | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o | 2023-10-10T09:50:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T09:48:52.263585](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o/blob/main/results_2023-10-10T09-48-52.263585.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5464725734742947,\n\
\ \"acc_stderr\": 0.03459202404654624,\n \"acc_norm\": 0.5504484069341958,\n\
\ \"acc_norm_stderr\": 0.03457351610520732,\n \"mc1\": 0.24969400244798043,\n\
\ \"mc1_stderr\": 0.015152286907148128,\n \"mc2\": 0.37024725158560373,\n\
\ \"mc2_stderr\": 0.013684716913788187\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5511945392491467,\n \"acc_stderr\": 0.014534599585097665,\n\
\ \"acc_norm\": 0.5836177474402731,\n \"acc_norm_stderr\": 0.014405618279436178\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6088428599880502,\n\
\ \"acc_stderr\": 0.004870121051762738,\n \"acc_norm\": 0.8109938259310894,\n\
\ \"acc_norm_stderr\": 0.003907133818428084\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874143,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874143\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.024180497164376893,\n \"\
acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376893\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\
\ \"acc_stderr\": 0.02692344605930284,\n \"acc_norm\": 0.6612903225806451,\n\
\ \"acc_norm_stderr\": 0.02692344605930284\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5384615384615384,\n \"acc_stderr\": 0.025275892070240644,\n\
\ \"acc_norm\": 0.5384615384615384,\n \"acc_norm_stderr\": 0.025275892070240644\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823019,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823019\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n\
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7596330275229358,\n \"acc_stderr\": 0.01832060732096407,\n \"\
acc_norm\": 0.7596330275229358,\n \"acc_norm_stderr\": 0.01832060732096407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.034028015813589656,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.034028015813589656\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693268,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693268\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955934,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.027236013946196697,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.027236013946196697\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7624521072796935,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.7624521072796935,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.02626167760780665,\n\
\ \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.02626167760780665\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32737430167597764,\n\
\ \"acc_stderr\": 0.015694238967737383,\n \"acc_norm\": 0.32737430167597764,\n\
\ \"acc_norm_stderr\": 0.015694238967737383\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631462,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631462\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.595679012345679,\n \"acc_stderr\": 0.027306625297327684,\n\
\ \"acc_norm\": 0.595679012345679,\n \"acc_norm_stderr\": 0.027306625297327684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.029427994039419987,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.029427994039419987\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4256844850065189,\n\
\ \"acc_stderr\": 0.012628393551811945,\n \"acc_norm\": 0.4256844850065189,\n\
\ \"acc_norm_stderr\": 0.012628393551811945\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329387,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329387\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5441176470588235,\n \"acc_stderr\": 0.02014893942041574,\n \
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.02014893942041574\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.04653429807913507,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.04653429807913507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235943,\n\
\ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235943\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.038743715565879536,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.038743715565879536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24969400244798043,\n\
\ \"mc1_stderr\": 0.015152286907148128,\n \"mc2\": 0.37024725158560373,\n\
\ \"mc2_stderr\": 0.013684716913788187\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|arc:challenge|25_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hellaswag|10_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T09-48-52.263585.parquet'
- config_name: results
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- results_2023-10-10T09-48-52.263585.parquet
- split: latest
path:
- results_2023-10-10T09-48-52.263585.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o",
"harness_truthfulqa_mc_0",
split="train")
```
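For example, to pull the most recent details for a single MMLU subtask, you can point at the per-task config and its "latest" split (a minimal sketch; the config name follows the `harness_hendrycksTest_<task>_5` pattern listed in this card's metadata):
```python
from datasets import load_dataset

# "latest" always resolves to the newest timestamped run of this config.
data = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(data)
```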
## Latest results
These are the [latest results from run 2023-10-10T09:48:52.263585](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o/blob/main/results_2023-10-10T09-48-52.263585.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5464725734742947,
"acc_stderr": 0.03459202404654624,
"acc_norm": 0.5504484069341958,
"acc_norm_stderr": 0.03457351610520732,
"mc1": 0.24969400244798043,
"mc1_stderr": 0.015152286907148128,
"mc2": 0.37024725158560373,
"mc2_stderr": 0.013684716913788187
},
"harness|arc:challenge|25": {
"acc": 0.5511945392491467,
"acc_stderr": 0.014534599585097665,
"acc_norm": 0.5836177474402731,
"acc_norm_stderr": 0.014405618279436178
},
"harness|hellaswag|10": {
"acc": 0.6088428599880502,
"acc_stderr": 0.004870121051762738,
"acc_norm": 0.8109938259310894,
"acc_norm_stderr": 0.003907133818428084
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874143,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874143
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.041614023984032786,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.041614023984032786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376893,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376893
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.02692344605930284,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.02692344605930284
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5384615384615384,
"acc_stderr": 0.025275892070240644,
"acc_norm": 0.5384615384615384,
"acc_norm_stderr": 0.025275892070240644
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823019,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823019
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7596330275229358,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.7596330275229358,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.034028015813589656,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.034028015813589656
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693268,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693268
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.029312814153955934,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.029312814153955934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196697,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196697
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7624521072796935,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.7624521072796935,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6098265895953757,
"acc_stderr": 0.02626167760780665,
"acc_norm": 0.6098265895953757,
"acc_norm_stderr": 0.02626167760780665
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32737430167597764,
"acc_stderr": 0.015694238967737383,
"acc_norm": 0.32737430167597764,
"acc_norm_stderr": 0.015694238967737383
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.027996723180631462,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.027996723180631462
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301754,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301754
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.595679012345679,
"acc_stderr": 0.027306625297327684,
"acc_norm": 0.595679012345679,
"acc_norm_stderr": 0.027306625297327684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.029427994039419987,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.029427994039419987
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4256844850065189,
"acc_stderr": 0.012628393551811945,
"acc_norm": 0.4256844850065189,
"acc_norm_stderr": 0.012628393551811945
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.030187532060329387,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.030187532060329387
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.02014893942041574,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.02014893942041574
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913507,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.031130880396235943,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.031130880396235943
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24969400244798043,
"mc1_stderr": 0.015152286907148128,
"mc2": 0.37024725158560373,
"mc2_stderr": 0.013684716913788187
}
}
```
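If you only need the aggregated numbers shown above rather than the per-example details, one option is to fetch the raw results file for this run directly (a minimal sketch, assuming the file keeps the layout shown above; depending on the harness version, the metrics may be nested under a `results` key):
```python
import json

from huggingface_hub import hf_hub_download

# Download the results JSON for this specific run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o",
    filename="results_2023-10-10T09-48-52.263585.json",
    repo_type="dataset",
)
with open(path) as f:
    payload = json.load(f)

# Handle both layouts: metrics at the top level, or nested under "results".
results = payload.get("results", payload)
print(results["all"]["acc"], results["all"]["acc_norm"])
```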
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
johannes-garstenauer/pooling_net_embeddings_dim_16_masked_dataset_1p | 2023-10-10T09:50:55.000Z | [
"region:us"
] | johannes-garstenauer | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: last_hs
sequence: float32
- name: label
dtype: int64
splits:
- name: train
num_bytes: 51148
num_examples: 673
download_size: 61004
dataset_size: 51148
---
# Dataset Card for "pooling_net_embeddings_dim_16_masked_dataset_1p"
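A minimal loading sketch (assuming the default config; the feature names come from the `dataset_info` block above):
```python
from datasets import load_dataset

# Single "train" split with 673 examples; each row has a float32 sequence
# ("last_hs") and an integer "label".
ds = load_dataset(
    "johannes-garstenauer/pooling_net_embeddings_dim_16_masked_dataset_1p",
    split="train",
)
print(ds.features)
```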
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o | 2023-10-10T09:57:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T09:55:39.074089](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o/blob/main/results_2023-10-10T09-55-39.074089.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5579208482981199,\n\
\ \"acc_stderr\": 0.03451179922986214,\n \"acc_norm\": 0.561910682652304,\n\
\ \"acc_norm_stderr\": 0.034493014919848415,\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627908,\n \"mc2\": 0.4152969798879882,\n\
\ \"mc2_stderr\": 0.014212723478778425\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5426621160409556,\n \"acc_stderr\": 0.014558106543924068,\n\
\ \"acc_norm\": 0.5725255972696246,\n \"acc_norm_stderr\": 0.014456862944650652\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6117307309300936,\n\
\ \"acc_stderr\": 0.004863603638367452,\n \"acc_norm\": 0.8172674765982872,\n\
\ \"acc_norm_stderr\": 0.00385657294683102\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.02983280811479601,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.02983280811479601\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842425,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842425\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342658,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.024552292209342658\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.026729499068349958,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.026729499068349958\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178816,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178816\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.02811209121011747,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.02811209121011747\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846482,\n\
\ \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846482\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413925,\n \
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413925\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7522935779816514,\n \"acc_stderr\": 0.018508143602547832,\n \"\
acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.018508143602547832\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502327,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502327\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145638,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145638\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419994,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419994\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.04284467968052194,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.04284467968052194\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.026453508054040314,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.026453508054040314\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7496807151979565,\n\
\ \"acc_stderr\": 0.015491088951494578,\n \"acc_norm\": 0.7496807151979565,\n\
\ \"acc_norm_stderr\": 0.015491088951494578\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.026329813341946243,\n\
\ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.026329813341946243\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n\
\ \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n\
\ \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.02807415894760065,\n\
\ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.02807415894760065\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n\
\ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\
\ \"acc_stderr\": 0.012604960816087378,\n \"acc_norm\": 0.4198174706649283,\n\
\ \"acc_norm_stderr\": 0.012604960816087378\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.03004261583271486,\n\
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03004261583271486\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324227,\n \
\ \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324227\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.563265306122449,\n \"acc_stderr\": 0.031751952375833226,\n\
\ \"acc_norm\": 0.563265306122449,\n \"acc_norm_stderr\": 0.031751952375833226\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\
\ \"acc_stderr\": 0.03251006816458618,\n \"acc_norm\": 0.6965174129353234,\n\
\ \"acc_norm_stderr\": 0.03251006816458618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720685\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627908,\n \"mc2\": 0.4152969798879882,\n\
\ \"mc2_stderr\": 0.014212723478778425\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|arc:challenge|25_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hellaswag|10_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-55-39.074089.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-55-39.074089.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T09-55-39.074089.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T09-55-39.074089.parquet'
- config_name: results
data_files:
- split: 2023_10_10T09_55_39.074089
path:
- results_2023-10-10T09-55-39.074089.parquet
- split: latest
path:
- results_2023-10-10T09-55-39.074089.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o",
"harness_truthfulqa_mc_0",
split="train")
```
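For example, to load the latest run of a single subtask rather than the "train" split (a minimal variant; the config and split names are taken from the YAML header above):
```python
from datasets import load_dataset

# "latest" always resolves to the newest timestamped run,
# here 2023_10_10T09_55_39.074089 (see the split lists above).
data = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```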
## Latest results
These are the [latest results from run 2023-10-10T09:55:39.074089](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o/blob/main/results_2023-10-10T09-55-39.074089.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5579208482981199,
"acc_stderr": 0.03451179922986214,
"acc_norm": 0.561910682652304,
"acc_norm_stderr": 0.034493014919848415,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627908,
"mc2": 0.4152969798879882,
"mc2_stderr": 0.014212723478778425
},
"harness|arc:challenge|25": {
"acc": 0.5426621160409556,
"acc_stderr": 0.014558106543924068,
"acc_norm": 0.5725255972696246,
"acc_norm_stderr": 0.014456862944650652
},
"harness|hellaswag|10": {
"acc": 0.6117307309300936,
"acc_stderr": 0.004863603638367452,
"acc_norm": 0.8172674765982872,
"acc_norm_stderr": 0.00385657294683102
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.02983280811479601,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.02983280811479601
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842425,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842425
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.024552292209342658,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.024552292209342658
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.026729499068349958,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.026729499068349958
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.03242497958178816,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.03242497958178816
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.02811209121011747,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.02811209121011747
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5512820512820513,
"acc_stderr": 0.025217315184846482,
"acc_norm": 0.5512820512820513,
"acc_norm_stderr": 0.025217315184846482
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413925,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413925
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.018508143602547832,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.018508143602547832
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502327,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502327
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145638,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145638
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419994,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419994
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.04284467968052194,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.04284467968052194
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.0432704093257873,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.0432704093257873
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.026453508054040314,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.026453508054040314
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7496807151979565,
"acc_stderr": 0.015491088951494578,
"acc_norm": 0.7496807151979565,
"acc_norm_stderr": 0.015491088951494578
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6040462427745664,
"acc_stderr": 0.026329813341946243,
"acc_norm": 0.6040462427745664,
"acc_norm_stderr": 0.026329813341946243
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33854748603351953,
"acc_stderr": 0.01582670009648135,
"acc_norm": 0.33854748603351953,
"acc_norm_stderr": 0.01582670009648135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.02807415894760065,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.02807415894760065
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488544,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488544
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6141975308641975,
"acc_stderr": 0.027085401226132143,
"acc_norm": 0.6141975308641975,
"acc_norm_stderr": 0.027085401226132143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.012604960816087378,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.012604960816087378
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03004261583271486,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03004261583271486
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324227,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.563265306122449,
"acc_stderr": 0.031751952375833226,
"acc_norm": 0.563265306122449,
"acc_norm_stderr": 0.031751952375833226
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6965174129353234,
"acc_stderr": 0.03251006816458618,
"acc_norm": 0.6965174129353234,
"acc_norm_stderr": 0.03251006816458618
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479636,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479636
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627908,
"mc2": 0.4152969798879882,
"mc2_stderr": 0.014212723478778425
}
}
```
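The aggregated numbers shown above are also available programmatically through the "results" configuration (a minimal sketch; the config and split names come from the YAML header above):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of each run;
# "latest" points to results_2023-10-10T09-55-39.074089.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o",
    "results",
    split="latest",
)
```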
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Colin23189/kaggle-llm-v2 | 2023-10-10T10:04:36.000Z | [
"region:us"
] | Colin23189 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down | 2023-10-10T10:02:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated\
\ results of the run (and is used to compute and display the aggregated metrics on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T10:01:17.783068](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down/blob/main/results_2023-10-10T10-01-17.783068.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5536374505804326,\n\
\ \"acc_stderr\": 0.03447284896872045,\n \"acc_norm\": 0.5578844883616024,\n\
\ \"acc_norm_stderr\": 0.03445371982395855,\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.01557284045287583,\n \"mc2\": 0.39816555401276316,\n\
\ \"mc2_stderr\": 0.013926626868914686\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5170648464163823,\n \"acc_stderr\": 0.014602878388536597,\n\
\ \"acc_norm\": 0.5580204778156996,\n \"acc_norm_stderr\": 0.01451268252312834\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6113324039036049,\n\
\ \"acc_stderr\": 0.004864513262194312,\n \"acc_norm\": 0.8209520015933081,\n\
\ \"acc_norm_stderr\": 0.0038260895866500575\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383889,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383889\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.032469569197899575,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.032469569197899575\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.024373197867983053,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.024373197867983053\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n\
\ \"acc_stderr\": 0.027430866579973463,\n \"acc_norm\": 0.632258064516129,\n\
\ \"acc_norm_stderr\": 0.027430866579973463\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533084,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533084\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5282051282051282,\n \"acc_stderr\": 0.02531063925493389,\n \
\ \"acc_norm\": 0.5282051282051282,\n \"acc_norm_stderr\": 0.02531063925493389\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552378,\n \
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552378\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7688073394495413,\n \"acc_stderr\": 0.018075750241633142,\n \"\
acc_norm\": 0.7688073394495413,\n \"acc_norm_stderr\": 0.018075750241633142\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.0340763209385405,\n \"acc_norm\"\
: 0.5185185185185185,\n \"acc_norm_stderr\": 0.0340763209385405\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n\
\ \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.7598039215686274,\n\
\ \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n\
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922744,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922744\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.015464676163395964,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.015464676163395964\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3865921787709497,\n\
\ \"acc_stderr\": 0.016286674879101026,\n \"acc_norm\": 0.3865921787709497,\n\
\ \"acc_norm_stderr\": 0.016286674879101026\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809075,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809075\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\
\ \"acc_stderr\": 0.026795422327893937,\n \"acc_norm\": 0.6655948553054662,\n\
\ \"acc_norm_stderr\": 0.026795422327893937\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.02743162372241501,\n\
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.02743162372241501\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4361147327249022,\n\
\ \"acc_stderr\": 0.012665568135455328,\n \"acc_norm\": 0.4361147327249022,\n\
\ \"acc_norm_stderr\": 0.012665568135455328\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.029812630701569743,\n\
\ \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.029812630701569743\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.553921568627451,\n \"acc_stderr\": 0.020109864547181354,\n \
\ \"acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.020109864547181354\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n\
\ \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.01557284045287583,\n \"mc2\": 0.39816555401276316,\n\
\ \"mc2_stderr\": 0.013926626868914686\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|arc:challenge|25_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hellaswag|10_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T10-01-17.783068.parquet'
- config_name: results
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- results_2023-10-10T10-01-17.783068.parquet
- split: latest
path:
- results_2023-10-10T10-01-17.783068.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down",
"harness_truthfulqa_mc_0",
split="train")
```
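You can also target a specific configuration and split directly. A minimal sketch, using the config and split names listed in the YAML above (any other subtask config follows the same pattern):
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down"

# Per-sample details for one MMLU subtask; "latest" is an alias for the most
# recent run, while the timestamped split pins this exact evaluation.
details = load_dataset(REPO, "harness_hendrycksTest_abstract_algebra_5", split="latest")

# Aggregated metrics for the whole run live in the "results" configuration.
results = load_dataset(REPO, "results", split="2023_10_10T10_01_17.783068")
```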
## Latest results
These are the [latest results from run 2023-10-10T10:01:17.783068](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down/blob/main/results_2023-10-10T10-01-17.783068.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5536374505804326,
"acc_stderr": 0.03447284896872045,
"acc_norm": 0.5578844883616024,
"acc_norm_stderr": 0.03445371982395855,
"mc1": 0.2717258261933905,
"mc1_stderr": 0.01557284045287583,
"mc2": 0.39816555401276316,
"mc2_stderr": 0.013926626868914686
},
"harness|arc:challenge|25": {
"acc": 0.5170648464163823,
"acc_stderr": 0.014602878388536597,
"acc_norm": 0.5580204778156996,
"acc_norm_stderr": 0.01451268252312834
},
"harness|hellaswag|10": {
"acc": 0.6113324039036049,
"acc_stderr": 0.004864513262194312,
"acc_norm": 0.8209520015933081,
"acc_norm_stderr": 0.0038260895866500575
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383889,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383889
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.024373197867983053,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.024373197867983053
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.632258064516129,
"acc_stderr": 0.027430866579973463,
"acc_norm": 0.632258064516129,
"acc_norm_stderr": 0.027430866579973463
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031596,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031596
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533084,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5282051282051282,
"acc_stderr": 0.02531063925493389,
"acc_norm": 0.5282051282051282,
"acc_norm_stderr": 0.02531063925493389
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.03163145807552378,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.03163145807552378
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7688073394495413,
"acc_stderr": 0.018075750241633142,
"acc_norm": 0.7688073394495413,
"acc_norm_stderr": 0.018075750241633142
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.0340763209385405,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.0340763209385405
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922744,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922744
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395964,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395964
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3865921787709497,
"acc_stderr": 0.016286674879101026,
"acc_norm": 0.3865921787709497,
"acc_norm_stderr": 0.016286674879101026
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.028110928492809075,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.028110928492809075
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893937,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893937
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.02743162372241501,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.02743162372241501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4361147327249022,
"acc_stderr": 0.012665568135455328,
"acc_norm": 0.4361147327249022,
"acc_norm_stderr": 0.012665568135455328
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5955882352941176,
"acc_stderr": 0.029812630701569743,
"acc_norm": 0.5955882352941176,
"acc_norm_stderr": 0.029812630701569743
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.020109864547181354,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.020109864547181354
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5387755102040817,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.5387755102040817,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2717258261933905,
"mc1_stderr": 0.01557284045287583,
"mc2": 0.39816555401276316,
"mc2_stderr": 0.013926626868914686
}
}
```
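As a quick sanity check, the per-task numbers above can be reduced to a single MMLU figure. A minimal sketch, assuming the JSON block above has been parsed into a Python dict named `results` (for instance with `json.loads`):
```python
def mmlu_macro_average(results: dict) -> float:
    """Macro-average acc_norm over all hendrycksTest (MMLU) subtasks,
    given a results dict shaped like the JSON above (assumed name)."""
    scores = [
        metrics["acc_norm"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(scores) / len(scores)

# mmlu_macro_average(results) should land close to the "acc_norm" of the
# "all" block, which additionally folds in ARC and HellaSwag.
```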
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tinhpx2911/vi_general_1 | 2023-10-10T10:05:26.000Z | [
"region:us"
] | tinhpx2911 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_elinas__chronos007-70b | 2023-10-10T10:10:12.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of elinas/chronos007-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elinas/chronos007-70b](https://huggingface.co/elinas/chronos007-70b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elinas__chronos007-70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T10:08:50.772021](https://huggingface.co/datasets/open-llm-leaderboard/details_elinas__chronos007-70b/blob/main/results_2023-10-10T10-08-50.772021.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6924704612932385,\n\
\ \"acc_stderr\": 0.031262676706071496,\n \"acc_norm\": 0.6964780207046983,\n\
\ \"acc_norm_stderr\": 0.03123103152479671,\n \"mc1\": 0.41370869033047736,\n\
\ \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.5765003665263857,\n\
\ \"mc2_stderr\": 0.0150600091771299\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6527303754266212,\n \"acc_stderr\": 0.013913034529620453,\n\
\ \"acc_norm\": 0.7013651877133106,\n \"acc_norm_stderr\": 0.01337407861506874\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6874128659629556,\n\
\ \"acc_stderr\": 0.004626002828389176,\n \"acc_norm\": 0.8752240589524,\n\
\ \"acc_norm_stderr\": 0.003297893047728379\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n\
\ \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.025542846817400492,\n \"\
acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.025542846817400492\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8354838709677419,\n\
\ \"acc_stderr\": 0.021090847745939306,\n \"acc_norm\": 0.8354838709677419,\n\
\ \"acc_norm_stderr\": 0.021090847745939306\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5467980295566502,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.5467980295566502,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\
\ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6974358974358974,\n \"acc_stderr\": 0.023290888053772725,\n\
\ \"acc_norm\": 0.6974358974358974,\n \"acc_norm_stderr\": 0.023290888053772725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7563025210084033,\n \"acc_stderr\": 0.02788682807838055,\n \
\ \"acc_norm\": 0.7563025210084033,\n \"acc_norm_stderr\": 0.02788682807838055\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5033112582781457,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.5033112582781457,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8825688073394495,\n \"acc_stderr\": 0.01380278022737736,\n \"\
acc_norm\": 0.8825688073394495,\n \"acc_norm_stderr\": 0.01380278022737736\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.869198312236287,\n \"acc_stderr\": 0.02194876605947076,\n \
\ \"acc_norm\": 0.869198312236287,\n \"acc_norm_stderr\": 0.02194876605947076\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.027991534258519513,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.027991534258519513\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.030884661089515368,\n\
\ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.030884661089515368\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622814,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622814\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.01831589168562585,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.01831589168562585\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8480204342273308,\n\
\ \"acc_stderr\": 0.012837852506645216,\n \"acc_norm\": 0.8480204342273308,\n\
\ \"acc_norm_stderr\": 0.012837852506645216\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.022289638852617893,\n\
\ \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.022289638852617893\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5508379888268157,\n\
\ \"acc_stderr\": 0.01663583834163193,\n \"acc_norm\": 0.5508379888268157,\n\
\ \"acc_norm_stderr\": 0.01663583834163193\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958154,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958154\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7491961414790996,\n\
\ \"acc_stderr\": 0.024619771956697168,\n \"acc_norm\": 0.7491961414790996,\n\
\ \"acc_norm_stderr\": 0.024619771956697168\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.808641975308642,\n \"acc_stderr\": 0.021887704613396154,\n\
\ \"acc_norm\": 0.808641975308642,\n \"acc_norm_stderr\": 0.021887704613396154\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5425531914893617,\n \"acc_stderr\": 0.029719281272236834,\n \
\ \"acc_norm\": 0.5425531914893617,\n \"acc_norm_stderr\": 0.029719281272236834\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5417209908735332,\n\
\ \"acc_stderr\": 0.012725701656953642,\n \"acc_norm\": 0.5417209908735332,\n\
\ \"acc_norm_stderr\": 0.012725701656953642\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.75,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.7545454545454545,\n \"acc_stderr\": 0.041220665028782855,\n\
\ \"acc_norm\": 0.7545454545454545,\n \"acc_norm_stderr\": 0.041220665028782855\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7877551020408163,\n\
\ \"acc_stderr\": 0.026176967197866764,\n \"acc_norm\": 0.7877551020408163,\n\
\ \"acc_norm_stderr\": 0.026176967197866764\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101706,\n\
\ \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101706\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n\
\ \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n\
\ \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n\
\ \"acc_stderr\": 0.0266405825391332,\n \"acc_norm\": 0.8596491228070176,\n\
\ \"acc_norm_stderr\": 0.0266405825391332\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.41370869033047736,\n \"mc1_stderr\": 0.0172408618120998,\n\
\ \"mc2\": 0.5765003665263857,\n \"mc2_stderr\": 0.0150600091771299\n\
\ }\n}\n```"
repo_url: https://huggingface.co/elinas/chronos007-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|arc:challenge|25_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hellaswag|10_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-08-50.772021.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-08-50.772021.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T10-08-50.772021.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T10-08-50.772021.parquet'
- config_name: results
data_files:
- split: 2023_10_10T10_08_50.772021
path:
- results_2023-10-10T10-08-50.772021.parquet
- split: latest
path:
- results_2023-10-10T10-08-50.772021.parquet
---
# Dataset Card for Evaluation run of elinas/chronos007-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/elinas/chronos007-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [elinas/chronos007-70b](https://huggingface.co/elinas/chronos007-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elinas__chronos007-70b",
"harness_truthfulqa_mc_0",
split="train")
```
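Each per-task configuration, as well as the aggregated "results" configuration listed above, also exposes a `latest` split that always points at the most recent run. As a minimal sketch using those configs:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; the "latest" split of the
# "results" config points at results_2023-10-10T10-08-50.772021.parquet.
agg = load_dataset("open-llm-leaderboard/details_elinas__chronos007-70b",
                   "results",
                   split="latest")
print(agg[0])
```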
## Latest results
These are the [latest results from run 2023-10-10T10:08:50.772021](https://huggingface.co/datasets/open-llm-leaderboard/details_elinas__chronos007-70b/blob/main/results_2023-10-10T10-08-50.772021.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6924704612932385,
"acc_stderr": 0.031262676706071496,
"acc_norm": 0.6964780207046983,
"acc_norm_stderr": 0.03123103152479671,
"mc1": 0.41370869033047736,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.5765003665263857,
"mc2_stderr": 0.0150600091771299
},
"harness|arc:challenge|25": {
"acc": 0.6527303754266212,
"acc_stderr": 0.013913034529620453,
"acc_norm": 0.7013651877133106,
"acc_norm_stderr": 0.01337407861506874
},
"harness|hellaswag|10": {
"acc": 0.6874128659629556,
"acc_stderr": 0.004626002828389176,
"acc_norm": 0.8752240589524,
"acc_norm_stderr": 0.003297893047728379
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.025542846817400492,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.025542846817400492
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8354838709677419,
"acc_stderr": 0.021090847745939306,
"acc_norm": 0.8354838709677419,
"acc_norm_stderr": 0.021090847745939306
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5467980295566502,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.5467980295566502,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6974358974358974,
"acc_stderr": 0.023290888053772725,
"acc_norm": 0.6974358974358974,
"acc_norm_stderr": 0.023290888053772725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7563025210084033,
"acc_stderr": 0.02788682807838055,
"acc_norm": 0.7563025210084033,
"acc_norm_stderr": 0.02788682807838055
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5033112582781457,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.5033112582781457,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8825688073394495,
"acc_stderr": 0.01380278022737736,
"acc_norm": 0.8825688073394495,
"acc_norm_stderr": 0.01380278022737736
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.869198312236287,
"acc_stderr": 0.02194876605947076,
"acc_norm": 0.869198312236287,
"acc_norm_stderr": 0.02194876605947076
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519513,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519513
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.030884661089515368,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.030884661089515368
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622814,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622814
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911901,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911901
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.01831589168562585,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.01831589168562585
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8480204342273308,
"acc_stderr": 0.012837852506645216,
"acc_norm": 0.8480204342273308,
"acc_norm_stderr": 0.012837852506645216
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7803468208092486,
"acc_stderr": 0.022289638852617893,
"acc_norm": 0.7803468208092486,
"acc_norm_stderr": 0.022289638852617893
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5508379888268157,
"acc_stderr": 0.01663583834163193,
"acc_norm": 0.5508379888268157,
"acc_norm_stderr": 0.01663583834163193
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958154,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958154
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7491961414790996,
"acc_stderr": 0.024619771956697168,
"acc_norm": 0.7491961414790996,
"acc_norm_stderr": 0.024619771956697168
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.808641975308642,
"acc_stderr": 0.021887704613396154,
"acc_norm": 0.808641975308642,
"acc_norm_stderr": 0.021887704613396154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5425531914893617,
"acc_stderr": 0.029719281272236834,
"acc_norm": 0.5425531914893617,
"acc_norm_stderr": 0.029719281272236834
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5417209908735332,
"acc_stderr": 0.012725701656953642,
"acc_norm": 0.5417209908735332,
"acc_norm_stderr": 0.012725701656953642
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.75,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.75,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866764,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41370869033047736,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.5765003665263857,
"mc2_stderr": 0.0150600091771299
}
}
```
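To recompute an aggregate from this JSON, here is a minimal sketch. The repo id and filename are taken from the configs above; the exact top-level nesting of the JSON file is an assumption (hence the fallback lookup), so adjust the key access if your copy differs:
```python
import json
from statistics import mean

from huggingface_hub import hf_hub_download

# Download the raw results JSON for this run (repo id and filename come
# from the configs listed earlier in this card).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_elinas__chronos007-70b",
    filename="results_2023-10-10T10-08-50.772021.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Assumption: some dumps nest the per-task scores under a "results" key;
# fall back to the top level if that key is absent.
results = data.get("results", data)

# Macro-average normalized accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = [v["acc_norm"] for k, v in results.items() if "hendrycksTest" in k]
print(f"MMLU acc_norm (macro avg over {len(mmlu)} tasks): {mean(mmlu):.4f}")
```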
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
giuseppemartino/i-SAID_custom | 2023-10-10T15:43:47.000Z | [
"region:us"
] | giuseppemartino | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 6362576122.0
num_examples: 840
- name: validation
num_bytes: 905977299.0
num_examples: 99
download_size: 7262651438
dataset_size: 7268553421.0
---
# Dataset Card for "i-SAID_custom"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |