| id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
DataLinguistic/MutiDataset | 2023-09-02T15:25:35.000Z | [
"license:apache-2.0",
"region:us"
] | DataLinguistic | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
vevomalik7766/images | 2023-09-02T15:33:55.000Z | [
"region:us"
] | vevomalik7766 | null | null | null | 0 | 0 | Entry not found |
EuRoxxx/Gielavocal | 2023-09-02T15:37:08.000Z | [
"license:openrail",
"region:us"
] | EuRoxxx | null | null | null | 0 | 0 | ---
license: openrail
---
|
coralexbadea/monitorul_trial_full | 2023-09-02T15:41:10.000Z | [
"region:us"
] | coralexbadea | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 6473700
num_examples: 3622
download_size: 2519094
dataset_size: 6473700
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "monitorul_trial_full"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
siddharthbulia/therapy-data-set-llama | 2023-09-02T15:55:25.000Z | [
"region:us"
] | siddharthbulia | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 85705
num_examples: 661
download_size: 26557
dataset_size: 85705
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "therapy-data-set-llama"
A dataset focused on conversations between a therapist and a patient, which can be used directly for training LLaMA models.
The raw dataset is taken from [Pandora](https://github.com/avocadopelvis/pandora).
For example:

- Patient: Hi
- Therapist: Hello there. Tell me how are you feeling today?
- Patient: Is anyone there?
- Therapist: Hello there. Glad to see you're back. What's going on in your world right now?
- Patient: Good morning
- Therapist: Good morning. I hope you had a good night's sleep. How are you feeling today?
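As a minimal sketch, turns like the ones above can be joined into a single training string. The `Speaker: text` layout mirrors the example dialogue; the newline-joined template is an illustrative assumption, not the dataset's documented format.

```python
def format_dialogue(turns):
    """Join (speaker, utterance) pairs into one training string.

    The 'Speaker: utterance' prefixes mirror the example dialogue above;
    the newline-separated layout is an illustrative choice, not a
    documented training template.
    """
    return "\n".join(f"{speaker}: {utterance}" for speaker, utterance in turns)

turns = [
    ("Patient", "Hi"),
    ("Therapist", "Hello there. Tell me how are you feeling today?"),
]
print(format_dialogue(turns))
```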
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mozart-ai/info-qa | 2023-09-02T15:51:38.000Z | [
"region:us"
] | mozart-ai | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: url
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 103523
num_examples: 612
download_size: 28661
dataset_size: 103523
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "info-qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
reichenbach/drug_combi_instruct | 2023-09-02T15:58:10.000Z | [
"region:us"
] | reichenbach | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: doc_id
dtype: string
- name: sentence
dtype: string
- name: spans
list:
- name: span_id
dtype: int64
- name: text
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: token_start
dtype: int64
- name: token_end
dtype: int64
- name: rels
list:
- name: class
dtype: string
- name: spans
sequence: int64
- name: is_context_needed
dtype: bool
- name: paragraph
dtype: string
- name: source
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 5946054
num_examples: 1362
download_size: 2966437
dataset_size: 5946054
---
# Dataset Card for "drug_combi_instruct"
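Based on the feature schema above, a single record plausibly looks like the following. The field values and the token-index convention are invented for illustration; only the field names and types come from the declared schema.

```python
# Hypothetical record matching the declared features; values are invented,
# only the keys and dtypes follow the schema above.
record = {
    "doc_id": "doc_0",
    "sentence": "Drug A was combined with drug B.",
    "spans": [
        {"span_id": 0, "text": "Drug A", "start": 0, "end": 6,
         "token_start": 0, "token_end": 1},
        {"span_id": 1, "text": "drug B", "start": 25, "end": 31,
         "token_start": 5, "token_end": 6},
    ],
    "rels": [
        {"class": "COMB", "spans": [0, 1], "is_context_needed": False},
    ],
    "paragraph": "Drug A was combined with drug B.",
    "source": "example",
    "instruction": "List the drug combinations in the sentence.",
}

# Resolve a relation's span ids back to the span texts.
id_to_text = {s["span_id"]: s["text"] for s in record["spans"]}
pair = [id_to_text[i] for i in record["rels"][0]["spans"]]
print(pair)  # -> ['Drug A', 'drug B']
```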
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
snats/chico | 2023-09-02T16:01:14.000Z | [
"license:cc-by-4.0",
"region:us"
] | snats | null | null | null | 0 | 0 | ---
license: cc-by-4.0
---
|
reichenbach/drug_combi_instruct_test | 2023-09-02T16:03:29.000Z | [
"region:us"
] | reichenbach | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: doc_id
dtype: string
- name: sentence
dtype: string
- name: spans
list:
- name: span_id
dtype: int64
- name: text
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: token_start
dtype: int64
- name: token_end
dtype: int64
- name: rels
list:
- name: class
dtype: string
- name: spans
sequence: int64
- name: is_context_needed
dtype: bool
- name: paragraph
dtype: string
- name: source
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 1230393
num_examples: 272
download_size: 633198
dataset_size: 1230393
---
# Dataset Card for "drug_combi_instruct_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nasssss/TetoV3PorNas1 | 2023-09-02T19:49:16.000Z | [
"region:us"
] | Nasssss | null | null | null | 0 | 0 | Entry not found |
adolfocesar1/models-and-etc | 2023-09-02T16:07:13.000Z | [
"region:us"
] | adolfocesar1 | null | null | null | 0 | 0 | Entry not found |
R7-012/dump | 2023-10-10T23:41:18.000Z | [
"region:us"
] | R7-012 | null | null | null | 0 | 0 | Entry not found |
Leonliclash/AI | 2023-09-02T16:40:49.000Z | [
"region:us"
] | Leonliclash | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Devio__test-3b | 2023-09-02T16:43:30.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Devio/test-3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Devio/test-3b](https://huggingface.co/Devio/test-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Devio__test-3b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-02T16:42:09.049307](https://huggingface.co/datasets/open-llm-leaderboard/details_Devio__test-3b/blob/main/results_2023-09-02T16%3A42%3A09.049307.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2377565764307277,\n\
\ \"acc_stderr\": 0.030680693861815076,\n \"acc_norm\": 0.23959569657570515,\n\
\ \"acc_norm_stderr\": 0.030688989155040824,\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807765,\n \"mc2\": 0.41415759101311883,\n\
\ \"mc2_stderr\": 0.014688710447803573\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.25597269624573377,\n \"acc_stderr\": 0.012753013241244521,\n\
\ \"acc_norm\": 0.2764505119453925,\n \"acc_norm_stderr\": 0.013069662474252428\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.35988846843258315,\n\
\ \"acc_stderr\": 0.004789865379084508,\n \"acc_norm\": 0.4479187412865963,\n\
\ \"acc_norm_stderr\": 0.004962638446396\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740206,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740206\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.03355045304882923,\n\
\ \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.03355045304882923\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.026055296901152922,\n\
\ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.026055296901152922\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518754,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518754\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.03512207412302052,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.03512207412302052\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.29354838709677417,\n \"acc_stderr\": 0.02590608702131929,\n \"\
acc_norm\": 0.29354838709677417,\n \"acc_norm_stderr\": 0.02590608702131929\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2315270935960591,\n \"acc_stderr\": 0.029678333141444455,\n \"\
acc_norm\": 0.2315270935960591,\n \"acc_norm_stderr\": 0.029678333141444455\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.16,\n \"acc_stderr\": 0.036845294917747094,\n \"acc_norm\"\
: 0.16,\n \"acc_norm_stderr\": 0.036845294917747094\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2474747474747475,\n \"acc_stderr\": 0.030746300742124498,\n \"\
acc_norm\": 0.2474747474747475,\n \"acc_norm_stderr\": 0.030746300742124498\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.030516111371476008,\n\
\ \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.030516111371476008\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24871794871794872,\n \"acc_stderr\": 0.0219169577092138,\n \
\ \"acc_norm\": 0.24871794871794872,\n \"acc_norm_stderr\": 0.0219169577092138\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275805,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275805\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.02788682807838056,\n\
\ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.02788682807838056\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473836,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473836\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22568807339449543,\n \"acc_stderr\": 0.01792308766780305,\n \"\
acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.01792308766780305\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.21568627450980393,\n \"acc_stderr\": 0.028867431449849303,\n \"\
acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.028867431449849303\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.18834080717488788,\n\
\ \"acc_stderr\": 0.026241132996407252,\n \"acc_norm\": 0.18834080717488788,\n\
\ \"acc_norm_stderr\": 0.026241132996407252\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20085470085470086,\n\
\ \"acc_stderr\": 0.026246772946890477,\n \"acc_norm\": 0.20085470085470086,\n\
\ \"acc_norm_stderr\": 0.026246772946890477\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28607918263090676,\n\
\ \"acc_stderr\": 0.016160871405127543,\n \"acc_norm\": 0.28607918263090676,\n\
\ \"acc_norm_stderr\": 0.016160871405127543\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.1994219653179191,\n \"acc_stderr\": 0.021511900654252524,\n\
\ \"acc_norm\": 0.1994219653179191,\n \"acc_norm_stderr\": 0.021511900654252524\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574875,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574875\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19292604501607716,\n\
\ \"acc_stderr\": 0.022411516780911366,\n \"acc_norm\": 0.19292604501607716,\n\
\ \"acc_norm_stderr\": 0.022411516780911366\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902013,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902013\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24902216427640156,\n\
\ \"acc_stderr\": 0.01104489226404077,\n \"acc_norm\": 0.24902216427640156,\n\
\ \"acc_norm_stderr\": 0.01104489226404077\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.24632352941176472,\n \"acc_stderr\": 0.02617343857052,\n\
\ \"acc_norm\": 0.24632352941176472,\n \"acc_norm_stderr\": 0.02617343857052\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25326797385620914,\n \"acc_stderr\": 0.017593486895366835,\n \
\ \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.017593486895366835\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.04122066502878285,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.04122066502878285\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.208955223880597,\n\
\ \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.208955223880597,\n\
\ \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n\
\ \"acc_stderr\": 0.031069390260789424,\n \"acc_norm\": 0.19879518072289157,\n\
\ \"acc_norm_stderr\": 0.031069390260789424\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807765,\n \"mc2\": 0.41415759101311883,\n\
\ \"mc2_stderr\": 0.014688710447803573\n }\n}\n```"
repo_url: https://huggingface.co/Devio/test-3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|arc:challenge|25_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hellaswag|10_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T16:42:09.049307.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T16:42:09.049307.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T16:42:09.049307.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T16:42:09.049307.parquet'
- config_name: results
data_files:
- split: 2023_09_02T16_42_09.049307
path:
- results_2023-09-02T16:42:09.049307.parquet
- split: latest
path:
- results_2023-09-02T16:42:09.049307.parquet
---
# Dataset Card for Evaluation run of Devio/test-3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Devio/test-3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Devio/test-3b](https://huggingface.co/Devio/test-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Devio__test-3b",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-02T16:42:09.049307](https://huggingface.co/datasets/open-llm-leaderboard/details_Devio__test-3b/blob/main/results_2023-09-02T16%3A42%3A09.049307.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2377565764307277,
"acc_stderr": 0.030680693861815076,
"acc_norm": 0.23959569657570515,
"acc_norm_stderr": 0.030688989155040824,
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807765,
"mc2": 0.41415759101311883,
"mc2_stderr": 0.014688710447803573
},
"harness|arc:challenge|25": {
"acc": 0.25597269624573377,
"acc_stderr": 0.012753013241244521,
"acc_norm": 0.2764505119453925,
"acc_norm_stderr": 0.013069662474252428
},
"harness|hellaswag|10": {
"acc": 0.35988846843258315,
"acc_stderr": 0.004789865379084508,
"acc_norm": 0.4479187412865963,
"acc_norm_stderr": 0.004962638446396
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740206,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740206
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21710526315789475,
"acc_stderr": 0.03355045304882923,
"acc_norm": 0.21710526315789475,
"acc_norm_stderr": 0.03355045304882923
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.026055296901152922,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.026055296901152922
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518754,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518754
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302052,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302052
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.29354838709677417,
"acc_stderr": 0.02590608702131929,
"acc_norm": 0.29354838709677417,
"acc_norm_stderr": 0.02590608702131929
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2315270935960591,
"acc_stderr": 0.029678333141444455,
"acc_norm": 0.2315270935960591,
"acc_norm_stderr": 0.029678333141444455
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.16,
"acc_stderr": 0.036845294917747094,
"acc_norm": 0.16,
"acc_norm_stderr": 0.036845294917747094
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2474747474747475,
"acc_stderr": 0.030746300742124498,
"acc_norm": 0.2474747474747475,
"acc_norm_stderr": 0.030746300742124498
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.030516111371476008,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.030516111371476008
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24871794871794872,
"acc_stderr": 0.0219169577092138,
"acc_norm": 0.24871794871794872,
"acc_norm_stderr": 0.0219169577092138
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275805,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275805
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.02788682807838056,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.02788682807838056
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473836,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473836
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22568807339449543,
"acc_stderr": 0.01792308766780305,
"acc_norm": 0.22568807339449543,
"acc_norm_stderr": 0.01792308766780305
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.028867431449849303,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.028867431449849303
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.18834080717488788,
"acc_stderr": 0.026241132996407252,
"acc_norm": 0.18834080717488788,
"acc_norm_stderr": 0.026241132996407252
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20085470085470086,
"acc_stderr": 0.026246772946890477,
"acc_norm": 0.20085470085470086,
"acc_norm_stderr": 0.026246772946890477
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28607918263090676,
"acc_stderr": 0.016160871405127543,
"acc_norm": 0.28607918263090676,
"acc_norm_stderr": 0.016160871405127543
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.1994219653179191,
"acc_stderr": 0.021511900654252524,
"acc_norm": 0.1994219653179191,
"acc_norm_stderr": 0.021511900654252524
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574875,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574875
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19292604501607716,
"acc_stderr": 0.022411516780911366,
"acc_norm": 0.19292604501607716,
"acc_norm_stderr": 0.022411516780911366
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967277,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902013,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902013
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24902216427640156,
"acc_stderr": 0.01104489226404077,
"acc_norm": 0.24902216427640156,
"acc_norm_stderr": 0.01104489226404077
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.24632352941176472,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.24632352941176472,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.017593486895366835,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.017593486895366835
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.04122066502878285,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.04122066502878285
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.208955223880597,
"acc_stderr": 0.028748298931728655,
"acc_norm": 0.208955223880597,
"acc_norm_stderr": 0.028748298931728655
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.031069390260789424,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.031069390260789424
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807765,
"mc2": 0.41415759101311883,
"mc2_stderr": 0.014688710447803573
}
}
```
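For quick inspection, the aggregate block above can be parsed with the standard library alone. This is a minimal sketch — the inline JSON reproduces only part of the `"all"` entry from the results above, not the full file:

```python
import json

# Minimal sketch: parse the aggregate "all" block from the results shown above.
# Only a few fields are reproduced inline here for brevity.
results = json.loads("""
{
  "all": {
    "acc": 0.2377565764307277,
    "acc_norm": 0.23959569657570515,
    "mc2": 0.41415759101311883
  }
}
""")

overall = results["all"]
# Format the raw fractions as percentages with one decimal place.
print(f"acc: {overall['acc']:.1%}, acc_norm: {overall['acc_norm']:.1%}")
# → acc: 23.8%, acc_norm: 24.0%
```

The same approach works on the full results JSON linked above, where per-task entries (e.g. `"harness|arc:challenge|25"`) sit alongside `"all"`.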
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
profetize/kirsten_v4 | 2023-09-02T17:00:24.000Z | [
"region:us"
] | profetize | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validate
path: data/validate-*
dataset_info:
features:
- name: Filename
dtype: string
- name: URL
dtype: string
- name: Content
dtype: string
splits:
- name: train
num_bytes: 64737198.97551546
num_examples: 2793
- name: test
num_bytes: 21602244.699312713
num_examples: 932
- name: validate
num_bytes: 21579066.32517182
num_examples: 931
download_size: 63041115
dataset_size: 107918510.0
---
# Dataset Card for "kirsten_v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_NobodyExistsOnTheInternet__GiftedConvo13bLoraNoEconsE4 | 2023-09-23T08:18:49.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NobodyExistsOnTheInternet/GiftedConvo13bLoraNoEconsE4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NobodyExistsOnTheInternet/GiftedConvo13bLoraNoEconsE4](https://huggingface.co/NobodyExistsOnTheInternet/GiftedConvo13bLoraNoEconsE4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NobodyExistsOnTheInternet__GiftedConvo13bLoraNoEconsE4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T08:18:36.826328](https://huggingface.co/datasets/open-llm-leaderboard/details_NobodyExistsOnTheInternet__GiftedConvo13bLoraNoEconsE4/blob/main/results_2023-09-23T08-18-36.826328.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.02160234899328859,\n\
\ \"em_stderr\": 0.0014888393578850604,\n \"f1\": 0.07773175335570466,\n\
\ \"f1_stderr\": 0.0019038640159988432,\n \"acc\": 0.4092104767130632,\n\
\ \"acc_stderr\": 0.009856677593330436\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.02160234899328859,\n \"em_stderr\": 0.0014888393578850604,\n\
\ \"f1\": 0.07773175335570466,\n \"f1_stderr\": 0.0019038640159988432\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07808946171341925,\n \
\ \"acc_stderr\": 0.0073906544811082045\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n\
\ }\n}\n```"
repo_url: https://huggingface.co/NobodyExistsOnTheInternet/GiftedConvo13bLoraNoEconsE4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T08_18_36.826328
path:
- '**/details_harness|drop|3_2023-09-23T08-18-36.826328.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T08-18-36.826328.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T08_18_36.826328
path:
- '**/details_harness|gsm8k|5_2023-09-23T08-18-36.826328.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T08-18-36.826328.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T08_18_36.826328
path:
- '**/details_harness|winogrande|5_2023-09-23T08-18-36.826328.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T08-18-36.826328.parquet'
- config_name: results
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- results_2023-09-02T17:03:57.703003.parquet
- split: 2023_09_23T08_18_36.826328
path:
- results_2023-09-23T08-18-36.826328.parquet
- split: latest
path:
- results_2023-09-23T08-18-36.826328.parquet
---
# Dataset Card for Evaluation run of NobodyExistsOnTheInternet/GiftedConvo13bLoraNoEconsE4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NobodyExistsOnTheInternet/GiftedConvo13bLoraNoEconsE4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NobodyExistsOnTheInternet/GiftedConvo13bLoraNoEconsE4](https://huggingface.co/NobodyExistsOnTheInternet/GiftedConvo13bLoraNoEconsE4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NobodyExistsOnTheInternet__GiftedConvo13bLoraNoEconsE4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-23T08:18:36.826328](https://huggingface.co/datasets/open-llm-leaderboard/details_NobodyExistsOnTheInternet__GiftedConvo13bLoraNoEconsE4/blob/main/results_2023-09-23T08-18-36.826328.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.02160234899328859,
"em_stderr": 0.0014888393578850604,
"f1": 0.07773175335570466,
"f1_stderr": 0.0019038640159988432,
"acc": 0.4092104767130632,
"acc_stderr": 0.009856677593330436
},
"harness|drop|3": {
"em": 0.02160234899328859,
"em_stderr": 0.0014888393578850604,
"f1": 0.07773175335570466,
"f1_stderr": 0.0019038640159988432
},
"harness|gsm8k|5": {
"acc": 0.07808946171341925,
"acc_stderr": 0.0073906544811082045
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552667
}
}
```
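As a quick sanity check, the headline `"all"` accuracy above can be reproduced from the per-task entries: it is the unweighted mean over the accuracy-bearing tasks (the drop task reports em/f1 rather than acc, so it does not enter the average). A minimal sketch, with the values copied from the JSON above:

```python
# Per-task accuracies copied from the "Latest results" JSON above.
task_acc = {
    "harness|gsm8k|5": 0.07808946171341925,
    "harness|winogrande|5": 0.7403314917127072,
}

# The reported "all" accuracy is the unweighted mean over these tasks.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(mean_acc)  # matches the reported "all" acc of 0.4092104767130632
```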
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
maisi7/mflachs | 2023-09-02T17:16:16.000Z | [
"region:us"
] | maisi7 | null | null | null | 0 | 0 | Entry not found |
saritha123/saritha321 | 2023-09-02T17:28:18.000Z | [
"license:openrail",
"region:us"
] | saritha123 | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_Devio__testC | 2023-09-02T17:28:32.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Devio/testC
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Devio/testC](https://huggingface.co/Devio/testC) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Devio__testC\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-02T17:27:16.860385](https://huggingface.co/datasets/open-llm-leaderboard/details_Devio__testC/blob/main/results_2023-09-02T17%3A27%3A16.860385.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.28185588236286707,\n\
\ \"acc_stderr\": 0.03225753349873974,\n \"acc_norm\": 0.2855290591736718,\n\
\ \"acc_norm_stderr\": 0.03226027924923892,\n \"mc1\": 0.20318237454100369,\n\
\ \"mc1_stderr\": 0.014085666526340882,\n \"mc2\": 0.35665813452391837,\n\
\ \"mc2_stderr\": 0.014271431688144938\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35494880546075086,\n \"acc_stderr\": 0.013983036904094097,\n\
\ \"acc_norm\": 0.39590443686006827,\n \"acc_norm_stderr\": 0.014291228393536583\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4529974108743278,\n\
\ \"acc_stderr\": 0.004967685204073108,\n \"acc_norm\": 0.6287592113124876,\n\
\ \"acc_norm_stderr\": 0.004821492994082116\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.03823428969926603,\n\
\ \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.03823428969926603\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857,\n\
\ \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493857\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.27167630057803466,\n\
\ \"acc_stderr\": 0.0339175032232166,\n \"acc_norm\": 0.27167630057803466,\n\
\ \"acc_norm_stderr\": 0.0339175032232166\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.16,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.16,\n\
\ \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.02964400657700962,\n\
\ \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.02964400657700962\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.037528339580033376,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.037528339580033376\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184756,\n \"\
acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184756\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3258064516129032,\n \"acc_stderr\": 0.0266620105785671,\n \"acc_norm\"\
: 0.3258064516129032,\n \"acc_norm_stderr\": 0.0266620105785671\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0317852971064275,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0317852971064275\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \
\ \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34615384615384615,\n \"acc_stderr\": 0.024121125416941183,\n\
\ \"acc_norm\": 0.34615384615384615,\n \"acc_norm_stderr\": 0.024121125416941183\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712177,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712177\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.33613445378151263,\n \"acc_stderr\": 0.030684737115135356,\n\
\ \"acc_norm\": 0.33613445378151263,\n \"acc_norm_stderr\": 0.030684737115135356\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3431192660550459,\n \"acc_stderr\": 0.02035477773608604,\n \"\
acc_norm\": 0.3431192660550459,\n \"acc_norm_stderr\": 0.02035477773608604\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.20675105485232068,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.20675105485232068,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.15695067264573992,\n\
\ \"acc_stderr\": 0.024413587174907412,\n \"acc_norm\": 0.15695067264573992,\n\
\ \"acc_norm_stderr\": 0.024413587174907412\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002161,\n \"\
acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002161\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252628,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.20535714285714285,\n\
\ \"acc_stderr\": 0.038342410214190735,\n \"acc_norm\": 0.20535714285714285,\n\
\ \"acc_norm_stderr\": 0.038342410214190735\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4174757281553398,\n \"acc_stderr\": 0.04882840548212237,\n\
\ \"acc_norm\": 0.4174757281553398,\n \"acc_norm_stderr\": 0.04882840548212237\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18803418803418803,\n\
\ \"acc_stderr\": 0.025598193686652244,\n \"acc_norm\": 0.18803418803418803,\n\
\ \"acc_norm_stderr\": 0.025598193686652244\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.210727969348659,\n\
\ \"acc_stderr\": 0.014583812465862553,\n \"acc_norm\": 0.210727969348659,\n\
\ \"acc_norm_stderr\": 0.014583812465862553\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.22832369942196531,\n \"acc_stderr\": 0.02259870380432162,\n\
\ \"acc_norm\": 0.22832369942196531,\n \"acc_norm_stderr\": 0.02259870380432162\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3006535947712418,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.3006535947712418,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n\
\ \"acc_stderr\": 0.02512263760881664,\n \"acc_norm\": 0.26688102893890675,\n\
\ \"acc_norm_stderr\": 0.02512263760881664\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.02419180860071301,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.02419180860071301\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843003,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843003\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142695,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142695\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.22058823529411764,\n \"acc_stderr\": 0.01677467236546851,\n \
\ \"acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.01677467236546851\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.2727272727272727,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.39591836734693875,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.39591836734693875,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.03115715086935556,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.03115715086935556\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n\
\ \"acc_stderr\": 0.0317555478662992,\n \"acc_norm\": 0.21084337349397592,\n\
\ \"acc_norm_stderr\": 0.0317555478662992\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.14619883040935672,\n \"acc_stderr\": 0.027097290118070803,\n\
\ \"acc_norm\": 0.14619883040935672,\n \"acc_norm_stderr\": 0.027097290118070803\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20318237454100369,\n\
\ \"mc1_stderr\": 0.014085666526340882,\n \"mc2\": 0.35665813452391837,\n\
\ \"mc2_stderr\": 0.014271431688144938\n }\n}\n```"
repo_url: https://huggingface.co/Devio/testC
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:27:16.860385.parquet'
- config_name: results
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- results_2023-09-02T17:27:16.860385.parquet
- split: latest
path:
- results_2023-09-02T17:27:16.860385.parquet
---
# Dataset Card for Evaluation run of Devio/testC
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Devio/testC
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Devio/testC](https://huggingface.co/Devio/testC) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Devio__testC",
"harness_truthfulqa_mc_0",
split="train")
```
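Each split name encodes the run timestamp, with underscores in place of the `-` and `:` characters of ISO 8601 (e.g. `2023_09_02T17_27_16.860385`). To sort or compare runs programmatically, you can convert a split name back to a `datetime`; the helper below is illustrative, and the underscore convention is inferred from the splits listed in this card:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names use '_' where ISO 8601 uses '-' (in the date) and ':' (in the time),
    # e.g. "2023_09_02T17_27_16.860385" <-> "2023-09-02T17:27:16.860385".
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

run_time = split_to_datetime("2023_09_02T17_27_16.860385")
print(run_time.isoformat())  # 2023-09-02T17:27:16.860385
```

With several runs present, `max(splits, key=split_to_datetime)` picks the most recent one, which should match the "latest" split.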
## Latest results
These are the [latest results from run 2023-09-02T17:27:16.860385](https://huggingface.co/datasets/open-llm-leaderboard/details_Devio__testC/blob/main/results_2023-09-02T17%3A27%3A16.860385.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.28185588236286707,
"acc_stderr": 0.03225753349873974,
"acc_norm": 0.2855290591736718,
"acc_norm_stderr": 0.03226027924923892,
"mc1": 0.20318237454100369,
"mc1_stderr": 0.014085666526340882,
"mc2": 0.35665813452391837,
"mc2_stderr": 0.014271431688144938
},
"harness|arc:challenge|25": {
"acc": 0.35494880546075086,
"acc_stderr": 0.013983036904094097,
"acc_norm": 0.39590443686006827,
"acc_norm_stderr": 0.014291228393536583
},
"harness|hellaswag|10": {
"acc": 0.4529974108743278,
"acc_stderr": 0.004967685204073108,
"acc_norm": 0.6287592113124876,
"acc_norm_stderr": 0.004821492994082116
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.32894736842105265,
"acc_stderr": 0.03823428969926603,
"acc_norm": 0.32894736842105265,
"acc_norm_stderr": 0.03823428969926603
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.0339175032232166,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.0339175032232166
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28936170212765955,
"acc_stderr": 0.02964400657700962,
"acc_norm": 0.28936170212765955,
"acc_norm_stderr": 0.02964400657700962
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.037528339580033376,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.037528339580033376
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184756,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184756
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3258064516129032,
"acc_stderr": 0.0266620105785671,
"acc_norm": 0.3258064516129032,
"acc_norm_stderr": 0.0266620105785671
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0317852971064275,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0317852971064275
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34615384615384615,
"acc_stderr": 0.024121125416941183,
"acc_norm": 0.34615384615384615,
"acc_norm_stderr": 0.024121125416941183
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712177,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712177
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.33613445378151263,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.33613445378151263,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3431192660550459,
"acc_stderr": 0.02035477773608604,
"acc_norm": 0.3431192660550459,
"acc_norm_stderr": 0.02035477773608604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20675105485232068,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.20675105485232068,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.15695067264573992,
"acc_stderr": 0.024413587174907412,
"acc_norm": 0.15695067264573992,
"acc_norm_stderr": 0.024413587174907412
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.14049586776859505,
"acc_stderr": 0.03172233426002161,
"acc_norm": 0.14049586776859505,
"acc_norm_stderr": 0.03172233426002161
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252628,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.20535714285714285,
"acc_stderr": 0.038342410214190735,
"acc_norm": 0.20535714285714285,
"acc_norm_stderr": 0.038342410214190735
},
"harness|hendrycksTest-management|5": {
"acc": 0.4174757281553398,
"acc_stderr": 0.04882840548212237,
"acc_norm": 0.4174757281553398,
"acc_norm_stderr": 0.04882840548212237
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18803418803418803,
"acc_stderr": 0.025598193686652244,
"acc_norm": 0.18803418803418803,
"acc_norm_stderr": 0.025598193686652244
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.210727969348659,
"acc_stderr": 0.014583812465862553,
"acc_norm": 0.210727969348659,
"acc_norm_stderr": 0.014583812465862553
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22832369942196531,
"acc_stderr": 0.02259870380432162,
"acc_norm": 0.22832369942196531,
"acc_norm_stderr": 0.02259870380432162
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3006535947712418,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.3006535947712418,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26688102893890675,
"acc_stderr": 0.02512263760881664,
"acc_norm": 0.26688102893890675,
"acc_norm_stderr": 0.02512263760881664
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.02419180860071301,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.02419180860071301
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843003,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843003
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142695,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142695
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.01677467236546851,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.01677467236546851
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39591836734693875,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.39591836734693875,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.03115715086935556,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.03115715086935556
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.0317555478662992,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.0317555478662992
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.14619883040935672,
"acc_stderr": 0.027097290118070803,
"acc_norm": 0.14619883040935672,
"acc_norm_stderr": 0.027097290118070803
},
"harness|truthfulqa:mc|0": {
"mc1": 0.20318237454100369,
"mc1_stderr": 0.014085666526340882,
"mc2": 0.35665813452391837,
"mc2_stderr": 0.014271431688144938
}
}
```
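The per-task entries above can be reduced to a macro-average with a few lines of Python. This is only a sketch of one plausible aggregation — whether the "all" values on this card are computed exactly this way is an assumption, and the small sample dict is illustrative:

```python
def macro_average(results: dict, metric: str = "acc") -> float:
    # Average a metric over all per-task entries that report it,
    # skipping the precomputed "all" aggregate.
    values = [
        task_metrics[metric]
        for task, task_metrics in results.items()
        if task != "all" and metric in task_metrics
    ]
    return sum(values) / len(values)

sample = {
    "all": {"acc": 0.3},
    "harness|arc:challenge|25": {"acc": 0.2},
    "harness|hellaswag|10": {"acc": 0.4},
}
print(round(macro_average(sample), 6))  # 0.3
```

The same function works for `"acc_norm"` or the stderr fields by changing the `metric` argument.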
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Devio__test100 | 2023-09-02T17:30:31.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Devio/test100
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Devio/test100](https://huggingface.co/Devio/test100) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Devio__test100\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-02T17:29:14.649417](https://huggingface.co/datasets/open-llm-leaderboard/details_Devio__test100/blob/main/results_2023-09-02T17%3A29%3A14.649417.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2766497501153852,\n\
\ \"acc_stderr\": 0.031976576858827,\n \"acc_norm\": 0.2798843599297305,\n\
\ \"acc_norm_stderr\": 0.031981630759923114,\n \"mc1\": 0.19706242350061198,\n\
\ \"mc1_stderr\": 0.013925080734473736,\n \"mc2\": 0.3401260823172781,\n\
\ \"mc2_stderr\": 0.014194140794117406\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3370307167235495,\n \"acc_stderr\": 0.013813476652902272,\n\
\ \"acc_norm\": 0.37372013651877134,\n \"acc_norm_stderr\": 0.014137708601759098\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4312885879306911,\n\
\ \"acc_stderr\": 0.004942440746328494,\n \"acc_norm\": 0.5854411471818363,\n\
\ \"acc_norm_stderr\": 0.0049163889621423205\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2943396226415094,\n \"acc_stderr\": 0.028049186315695248,\n\
\ \"acc_norm\": 0.2943396226415094,\n \"acc_norm_stderr\": 0.028049186315695248\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n\
\ \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n\
\ \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2170212765957447,\n \"acc_stderr\": 0.026947483121496217,\n\
\ \"acc_norm\": 0.2170212765957447,\n \"acc_norm_stderr\": 0.026947483121496217\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948368,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948368\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3225806451612903,\n\
\ \"acc_stderr\": 0.02659308451657228,\n \"acc_norm\": 0.3225806451612903,\n\
\ \"acc_norm_stderr\": 0.02659308451657228\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\"\
: 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3641025641025641,\n \"acc_stderr\": 0.02439667298509477,\n \
\ \"acc_norm\": 0.3641025641025641,\n \"acc_norm_stderr\": 0.02439667298509477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371216,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371216\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.03095663632856655,\n \
\ \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.03095663632856655\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3522935779816514,\n \"acc_stderr\": 0.020480568843998997,\n \"\
acc_norm\": 0.3522935779816514,\n \"acc_norm_stderr\": 0.020480568843998997\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n\
\ \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n\
\ \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.20253164556962025,\n \"acc_stderr\": 0.026160568246601457,\n\
\ \"acc_norm\": 0.20253164556962025,\n \"acc_norm_stderr\": 0.026160568246601457\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.10762331838565023,\n\
\ \"acc_stderr\": 0.020799400082879997,\n \"acc_norm\": 0.10762331838565023,\n\
\ \"acc_norm_stderr\": 0.020799400082879997\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.18181818181818182,\n \"acc_stderr\": 0.035208939510976554,\n \"\
acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.035208939510976554\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n\
\ \"acc_stderr\": 0.03485946096475741,\n \"acc_norm\": 0.16071428571428573,\n\
\ \"acc_norm_stderr\": 0.03485946096475741\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20434227330779056,\n\
\ \"acc_stderr\": 0.0144191239809319,\n \"acc_norm\": 0.20434227330779056,\n\
\ \"acc_norm_stderr\": 0.0144191239809319\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27039106145251396,\n\
\ \"acc_stderr\": 0.014854993938010102,\n \"acc_norm\": 0.27039106145251396,\n\
\ \"acc_norm_stderr\": 0.014854993938010102\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2973856209150327,\n \"acc_stderr\": 0.02617390850671858,\n\
\ \"acc_norm\": 0.2973856209150327,\n \"acc_norm_stderr\": 0.02617390850671858\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2540192926045016,\n\
\ \"acc_stderr\": 0.024723861504771696,\n \"acc_norm\": 0.2540192926045016,\n\
\ \"acc_norm_stderr\": 0.024723861504771696\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2375886524822695,\n \"acc_stderr\": 0.025389512552729906,\n \
\ \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.025389512552729906\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n\
\ \"acc_stderr\": 0.011025499291443738,\n \"acc_norm\": 0.24771838331160365,\n\
\ \"acc_norm_stderr\": 0.011025499291443738\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2173202614379085,\n \"acc_stderr\": 0.01668482092914859,\n \
\ \"acc_norm\": 0.2173202614379085,\n \"acc_norm_stderr\": 0.01668482092914859\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2885572139303483,\n\
\ \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.2885572139303483,\n\
\ \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n\
\ \"acc_stderr\": 0.030709824050565274,\n \"acc_norm\": 0.1927710843373494,\n\
\ \"acc_norm_stderr\": 0.030709824050565274\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.1695906432748538,\n \"acc_stderr\": 0.028782108105401712,\n\
\ \"acc_norm\": 0.1695906432748538,\n \"acc_norm_stderr\": 0.028782108105401712\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.19706242350061198,\n\
\ \"mc1_stderr\": 0.013925080734473736,\n \"mc2\": 0.3401260823172781,\n\
\ \"mc2_stderr\": 0.014194140794117406\n }\n}\n```"
repo_url: https://huggingface.co/Devio/test100
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:29:14.649417.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:29:14.649417.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:29:14.649417.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:29:14.649417.parquet'
- config_name: results
data_files:
- split: 2023_09_02T17_29_14.649417
path:
- results_2023-09-02T17:29:14.649417.parquet
- split: latest
path:
- results_2023-09-02T17:29:14.649417.parquet
---
# Dataset Card for Evaluation run of Devio/test100
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Devio/test100
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Devio/test100](https://huggingface.co/Devio/test100) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
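As a side note, the timestamped split names simply replace the punctuation of the run timestamp with underscores. A minimal sketch of that mapping (assuming you only need to derive a split name from a timestamp, without downloading anything):

```python
def run_timestamp_to_split(ts: str) -> str:
    # Dashes and colons in the run timestamp become underscores in the
    # split name; the fractional-seconds dot is kept as-is, e.g.
    # "2023-09-02T17:29:14.649417" -> "2023_09_02T17_29_14.649417".
    return ts.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-09-02T17:29:14.649417"))
```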
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Devio__test100",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-02T17:29:14.649417](https://huggingface.co/datasets/open-llm-leaderboard/details_Devio__test100/blob/main/results_2023-09-02T17%3A29%3A14.649417.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2766497501153852,
"acc_stderr": 0.031976576858827,
"acc_norm": 0.2798843599297305,
"acc_norm_stderr": 0.031981630759923114,
"mc1": 0.19706242350061198,
"mc1_stderr": 0.013925080734473736,
"mc2": 0.3401260823172781,
"mc2_stderr": 0.014194140794117406
},
"harness|arc:challenge|25": {
"acc": 0.3370307167235495,
"acc_stderr": 0.013813476652902272,
"acc_norm": 0.37372013651877134,
"acc_norm_stderr": 0.014137708601759098
},
"harness|hellaswag|10": {
"acc": 0.4312885879306911,
"acc_stderr": 0.004942440746328494,
"acc_norm": 0.5854411471818363,
"acc_norm_stderr": 0.0049163889621423205
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3355263157894737,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.3355263157894737,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2943396226415094,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.2943396226415094,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2170212765957447,
"acc_stderr": 0.026947483121496217,
"acc_norm": 0.2170212765957447,
"acc_norm_stderr": 0.026947483121496217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948368,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948368
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3225806451612903,
"acc_stderr": 0.02659308451657228,
"acc_norm": 0.3225806451612903,
"acc_norm_stderr": 0.02659308451657228
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358609,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358609
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3641025641025641,
"acc_stderr": 0.02439667298509477,
"acc_norm": 0.3641025641025641,
"acc_norm_stderr": 0.02439667298509477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02671924078371216,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02671924078371216
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3522935779816514,
"acc_stderr": 0.020480568843998997,
"acc_norm": 0.3522935779816514,
"acc_norm_stderr": 0.020480568843998997
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20253164556962025,
"acc_stderr": 0.026160568246601457,
"acc_norm": 0.20253164556962025,
"acc_norm_stderr": 0.026160568246601457
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.10762331838565023,
"acc_stderr": 0.020799400082879997,
"acc_norm": 0.10762331838565023,
"acc_norm_stderr": 0.020799400082879997
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.035208939510976554,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.035208939510976554
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16071428571428573,
"acc_stderr": 0.03485946096475741,
"acc_norm": 0.16071428571428573,
"acc_norm_stderr": 0.03485946096475741
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20434227330779056,
"acc_stderr": 0.0144191239809319,
"acc_norm": 0.20434227330779056,
"acc_norm_stderr": 0.0144191239809319
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757183,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27039106145251396,
"acc_stderr": 0.014854993938010102,
"acc_norm": 0.27039106145251396,
"acc_norm_stderr": 0.014854993938010102
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2973856209150327,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.2973856209150327,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2540192926045016,
"acc_stderr": 0.024723861504771696,
"acc_norm": 0.2540192926045016,
"acc_norm_stderr": 0.024723861504771696
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.025389512552729906,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.025389512552729906
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24771838331160365,
"acc_stderr": 0.011025499291443738,
"acc_norm": 0.24771838331160365,
"acc_norm_stderr": 0.011025499291443738
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2173202614379085,
"acc_stderr": 0.01668482092914859,
"acc_norm": 0.2173202614379085,
"acc_norm_stderr": 0.01668482092914859
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2885572139303483,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.2885572139303483,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1927710843373494,
"acc_stderr": 0.030709824050565274,
"acc_norm": 0.1927710843373494,
"acc_norm_stderr": 0.030709824050565274
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.1695906432748538,
"acc_stderr": 0.028782108105401712,
"acc_norm": 0.1695906432748538,
"acc_norm_stderr": 0.028782108105401712
},
"harness|truthfulqa:mc|0": {
"mc1": 0.19706242350061198,
"mc1_stderr": 0.013925080734473736,
"mc2": 0.3401260823172781,
"mc2_stderr": 0.014194140794117406
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
NobodyExistsOnTheInternet/EconConversation | 2023-09-02T18:34:38.000Z | [
"license:mit",
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 0 | 0 | ---
license: mit
---
|
open-llm-leaderboard/details_luffycodes__mcq-hal-vicuna-13b-v1.5 | 2023-09-23T06:42:39.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of luffycodes/mcq-hal-vicuna-13b-v1.5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [luffycodes/mcq-hal-vicuna-13b-v1.5](https://huggingface.co/luffycodes/mcq-hal-vicuna-13b-v1.5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_luffycodes__mcq-hal-vicuna-13b-v1.5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T06:42:27.712950](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__mcq-hal-vicuna-13b-v1.5/blob/main/results_2023-09-23T06-42-27.712950.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1556208053691275,\n\
\ \"em_stderr\": 0.0037122929662357657,\n \"f1\": 0.21929530201342184,\n\
\ \"f1_stderr\": 0.0037761120976932553,\n \"acc\": 0.40820339964803165,\n\
\ \"acc_stderr\": 0.010171078364256323\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.1556208053691275,\n \"em_stderr\": 0.0037122929662357657,\n\
\ \"f1\": 0.21929530201342184,\n \"f1_stderr\": 0.0037761120976932553\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0887035633055345,\n \
\ \"acc_stderr\": 0.007831458737058714\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7277032359905288,\n \"acc_stderr\": 0.012510697991453934\n\
\ }\n}\n```"
repo_url: https://huggingface.co/luffycodes/mcq-hal-vicuna-13b-v1.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|arc:challenge|25_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T06_42_27.712950
path:
- '**/details_harness|drop|3_2023-09-23T06-42-27.712950.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T06-42-27.712950.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T06_42_27.712950
path:
- '**/details_harness|gsm8k|5_2023-09-23T06-42-27.712950.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T06-42-27.712950.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hellaswag|10_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:48:54.991558.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T20:57:06.910156.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:48:54.991558.parquet'
- split: 2023_09_02T20_57_06.910156
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T20:57:06.910156.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T20:57:06.910156.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T06_42_27.712950
path:
- '**/details_harness|winogrande|5_2023-09-23T06-42-27.712950.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T06-42-27.712950.parquet'
- config_name: results
data_files:
- split: 2023_09_02T17_48_54.991558
path:
- results_2023-09-02T17:48:54.991558.parquet
- split: 2023_09_02T20_57_06.910156
path:
- results_2023-09-02T20:57:06.910156.parquet
- split: 2023_09_23T06_42_27.712950
path:
- results_2023-09-23T06-42-27.712950.parquet
- split: latest
path:
- results_2023-09-23T06-42-27.712950.parquet
---
# Dataset Card for Evaluation run of luffycodes/mcq-hal-vicuna-13b-v1.5
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/luffycodes/mcq-hal-vicuna-13b-v1.5
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [luffycodes/mcq-hal-vicuna-13b-v1.5](https://huggingface.co/luffycodes/mcq-hal-vicuna-13b-v1.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_luffycodes__mcq-hal-vicuna-13b-v1.5",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-23T06:42:27.712950](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__mcq-hal-vicuna-13b-v1.5/blob/main/results_2023-09-23T06-42-27.712950.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.1556208053691275,
"em_stderr": 0.0037122929662357657,
"f1": 0.21929530201342184,
"f1_stderr": 0.0037761120976932553,
"acc": 0.40820339964803165,
"acc_stderr": 0.010171078364256323
},
"harness|drop|3": {
"em": 0.1556208053691275,
"em_stderr": 0.0037122929662357657,
"f1": 0.21929530201342184,
"f1_stderr": 0.0037761120976932553
},
"harness|gsm8k|5": {
"acc": 0.0887035633055345,
"acc_stderr": 0.007831458737058714
},
"harness|winogrande|5": {
"acc": 0.7277032359905288,
"acc_stderr": 0.012510697991453934
}
}
```
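The JSON above nests each task's metrics under a harness key of the form `harness|<task>|<n-shot>`. As a minimal illustration of that structure (the `flatten_accuracies` helper below is hypothetical, not part of the leaderboard tooling, and the values are truncated from the run above), the per-task accuracies can be pulled out of such a payload like this:

```python
# Hypothetical helper: flatten a leaderboard results payload into
# {task_name: accuracy}, skipping the "all" aggregate and any task
# (such as harness|drop|3) that reports em/f1 instead of acc.
def flatten_accuracies(results: dict) -> dict:
    return {
        task: metrics["acc"]
        for task, metrics in results.items()
        if task != "all" and "acc" in metrics
    }


# Values truncated from the run above, for illustration only.
results = {
    "all": {"em": 0.1556, "f1": 0.2193, "acc": 0.4082},
    "harness|gsm8k|5": {"acc": 0.0887, "acc_stderr": 0.0078},
    "harness|winogrande|5": {"acc": 0.7277, "acc_stderr": 0.0125},
}

print(flatten_accuracies(results))
# → {'harness|gsm8k|5': 0.0887, 'harness|winogrande|5': 0.7277}
```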
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
chukypedro/d4 | 2023-09-02T17:54:20.000Z | [
"region:us"
] | chukypedro | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2 | 2023-09-02T18:00:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of migtissera/Synthia-70B-v1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Synthia-70B-v1.2](https://huggingface.co/migtissera/Synthia-70B-v1.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 61 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
  \ be found as a specific split in each configuration, the split being named using\
  \ the timestamp of the run. The \"latest\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
  \nThese are the [latest results from run 2023-09-02T17:59:05.420313](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2/blob/main/results_2023-09-02T17%3A59%3A05.420313.json)\
  \ (note that there might be results for other tasks in the repos if successive evals\
  \ didn't cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.700283718449465,\n\
\ \"acc_stderr\": 0.030924880314556678,\n \"acc_norm\": 0.7042589787396556,\n\
\ \"acc_norm_stderr\": 0.030893979991341382,\n \"mc1\": 0.4222766217870257,\n\
\ \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.5863515695677809,\n\
\ \"mc2_stderr\": 0.015002713147024338\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6578498293515358,\n \"acc_stderr\": 0.013864152159177278,\n\
\ \"acc_norm\": 0.7047781569965871,\n \"acc_norm_stderr\": 0.01332975029338232\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6822346146186019,\n\
\ \"acc_stderr\": 0.004646561453031608,\n \"acc_norm\": 0.8698466440948018,\n\
\ \"acc_norm_stderr\": 0.0033578442491239554\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8486842105263158,\n \"acc_stderr\": 0.02916263159684399,\n\
\ \"acc_norm\": 0.8486842105263158,\n \"acc_norm_stderr\": 0.02916263159684399\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.035506839891655796,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.035506839891655796\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.030472973363380045,\n\
\ \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.030472973363380045\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n\
\ \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.02559185776138218,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.02559185776138218\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n\
\ \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.8129032258064516,\n\
\ \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8888888888888888,\n \"acc_stderr\": 0.022390787638216773,\n \"\
acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.022390787638216773\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.717948717948718,\n \"acc_stderr\": 0.0228158130988966,\n \
\ \"acc_norm\": 0.717948717948718,\n \"acc_norm_stderr\": 0.0228158130988966\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7436974789915967,\n \"acc_stderr\": 0.02835962087053395,\n \
\ \"acc_norm\": 0.7436974789915967,\n \"acc_norm_stderr\": 0.02835962087053395\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8899082568807339,\n \"acc_stderr\": 0.013419939018681203,\n \"\
acc_norm\": 0.8899082568807339,\n \"acc_norm_stderr\": 0.013419939018681203\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997866,\n \"\
acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997866\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813902,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813902\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.890295358649789,\n \"acc_stderr\": 0.020343400734868834,\n \
\ \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.020343400734868834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n\
\ \"acc_stderr\": 0.026241132996407256,\n \"acc_norm\": 0.8116591928251121,\n\
\ \"acc_norm_stderr\": 0.026241132996407256\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.032178294207446306,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.032178294207446306\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.031722334260021585,\n \"\
acc_norm\": 0.859504132231405,\n \"acc_norm_stderr\": 0.031722334260021585\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8773946360153256,\n\
\ \"acc_stderr\": 0.011728672144131565,\n \"acc_norm\": 0.8773946360153256,\n\
\ \"acc_norm_stderr\": 0.011728672144131565\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7890173410404624,\n \"acc_stderr\": 0.02196630994704311,\n\
\ \"acc_norm\": 0.7890173410404624,\n \"acc_norm_stderr\": 0.02196630994704311\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5452513966480447,\n\
\ \"acc_stderr\": 0.016653875777524,\n \"acc_norm\": 0.5452513966480447,\n\
\ \"acc_norm_stderr\": 0.016653875777524\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8302469135802469,\n \"acc_stderr\": 0.020888690414093865,\n\
\ \"acc_norm\": 0.8302469135802469,\n \"acc_norm_stderr\": 0.020888690414093865\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.574468085106383,\n \"acc_stderr\": 0.02949482760014436,\n \
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.02949482760014436\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5514993481095176,\n\
\ \"acc_stderr\": 0.012702317490559821,\n \"acc_norm\": 0.5514993481095176,\n\
\ \"acc_norm_stderr\": 0.012702317490559821\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7532679738562091,\n \"acc_stderr\": 0.017440820367402503,\n \
\ \"acc_norm\": 0.7532679738562091,\n \"acc_norm_stderr\": 0.017440820367402503\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8081632653061225,\n \"acc_stderr\": 0.02520696315422538,\n\
\ \"acc_norm\": 0.8081632653061225,\n \"acc_norm_stderr\": 0.02520696315422538\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276915,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276915\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4222766217870257,\n\
\ \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.5863515695677809,\n\
\ \"mc2_stderr\": 0.015002713147024338\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/Synthia-70B-v1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:59:05.420313.parquet'
- config_name: results
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- results_2023-09-02T17:59:05.420313.parquet
- split: latest
path:
- results_2023-09-02T17:59:05.420313.parquet
---
# Dataset Card for Evaluation run of migtissera/Synthia-70B-v1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Synthia-70B-v1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Synthia-70B-v1.2](https://huggingface.co/migtissera/Synthia-70B-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-02T17:59:05.420313](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2/blob/main/results_2023-09-02T17%3A59%3A05.420313.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.700283718449465,
"acc_stderr": 0.030924880314556678,
"acc_norm": 0.7042589787396556,
"acc_norm_stderr": 0.030893979991341382,
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.5863515695677809,
"mc2_stderr": 0.015002713147024338
},
"harness|arc:challenge|25": {
"acc": 0.6578498293515358,
"acc_stderr": 0.013864152159177278,
"acc_norm": 0.7047781569965871,
"acc_norm_stderr": 0.01332975029338232
},
"harness|hellaswag|10": {
"acc": 0.6822346146186019,
"acc_stderr": 0.004646561453031608,
"acc_norm": 0.8698466440948018,
"acc_norm_stderr": 0.0033578442491239554
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8486842105263158,
"acc_stderr": 0.02916263159684399,
"acc_norm": 0.8486842105263158,
"acc_norm_stderr": 0.02916263159684399
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.035506839891655796,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.035506839891655796
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6808510638297872,
"acc_stderr": 0.030472973363380045,
"acc_norm": 0.6808510638297872,
"acc_norm_stderr": 0.030472973363380045
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.02559185776138218,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.02559185776138218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.022390787638216773,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.022390787638216773
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.0228158130988966,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.0228158130988966
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7436974789915967,
"acc_stderr": 0.02835962087053395,
"acc_norm": 0.7436974789915967,
"acc_norm_stderr": 0.02835962087053395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8899082568807339,
"acc_stderr": 0.013419939018681203,
"acc_norm": 0.8899082568807339,
"acc_norm_stderr": 0.013419939018681203
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997866,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997866
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813902,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813902
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.020343400734868834,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.020343400734868834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407256,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407256
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.032178294207446306,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.032178294207446306
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.031722334260021585,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.031722334260021585
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489122,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489122
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8773946360153256,
"acc_stderr": 0.011728672144131565,
"acc_norm": 0.8773946360153256,
"acc_norm_stderr": 0.011728672144131565
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7890173410404624,
"acc_stderr": 0.02196630994704311,
"acc_norm": 0.7890173410404624,
"acc_norm_stderr": 0.02196630994704311
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5452513966480447,
"acc_stderr": 0.016653875777524,
"acc_norm": 0.5452513966480447,
"acc_norm_stderr": 0.016653875777524
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8302469135802469,
"acc_stderr": 0.020888690414093865,
"acc_norm": 0.8302469135802469,
"acc_norm_stderr": 0.020888690414093865
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.02949482760014436,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.02949482760014436
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5514993481095176,
"acc_stderr": 0.012702317490559821,
"acc_norm": 0.5514993481095176,
"acc_norm_stderr": 0.012702317490559821
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103128,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103128
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7532679738562091,
"acc_stderr": 0.017440820367402503,
"acc_norm": 0.7532679738562091,
"acc_norm_stderr": 0.017440820367402503
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8081632653061225,
"acc_stderr": 0.02520696315422538,
"acc_norm": 0.8081632653061225,
"acc_norm_stderr": 0.02520696315422538
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276915,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276915
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.5863515695677809,
"mc2_stderr": 0.015002713147024338
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
prakhargupta94/recipe_300 | 2023-09-02T18:10:37.000Z | [
"region:us"
] | prakhargupta94 | null | null | null | 0 | 0 | Entry not found |
agusavior/agusavior | 2023-09-02T18:20:01.000Z | [
"license:mit",
"region:us"
] | agusavior | null | null | null | 0 | 0 | ---
license: mit
---
|
maitrang/viwiki_20230901 | 2023-09-02T18:29:42.000Z | [
"region:us"
] | maitrang | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1237663253
num_examples: 1287269
download_size: 562888266
dataset_size: 1237663253
---
# Dataset Card for "viwiki_20230901"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ketoxboompreis/slimmingkaufen | 2023-09-02T18:28:34.000Z | [
"region:us"
] | ketoxboompreis | null | null | null | 0 | 0 | Entry not found |
georgeiac00/giorgos-v1 | 2023-09-02T18:30:59.000Z | [
"region:us"
] | georgeiac00 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 3262852.707331852
num_examples: 22846
- name: validation
num_bytes: 398334.60396924726
num_examples: 3552
- name: test
num_bytes: 876501
num_examples: 5987
download_size: 2928861
dataset_size: 4537688.311301099
---
# Dataset Card for "giorgos-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
brazilianwoodsale/Brazilian-Wood-Male-Enhancement | 2023-09-02T18:33:18.000Z | [
"region:us"
] | brazilianwoodsale | null | null | null | 0 | 0 | <h1 style="text-align: left;">Brazilian Wood</h1>
<p><span style="font-family: georgia;"><strong>Product Name - Brazilian Wood<br /></strong></span></p>
<p><span style="font-family: georgia;"><strong>Side Effects - No Side Effects (100% Natural)</strong></span></p>
<p><span style="font-family: georgia;"><strong>Main Benefits - Finally Regrow Your Manhood To Its Full Size<br /></strong></span></p>
<p><span style="font-family: georgia;"><strong>Category - Male Enhancement<br /></strong></span></p>
<p><span style="font-family: georgia;"><strong>Results - In Few Weeks<br /></strong></span></p>
<p><span style="font-family: georgia;"><strong>Availability - Online</strong></span></p>
<p><span style="font-family: georgia;"><strong>Customer Reviews - ★★★★✰ 4.9/5</strong></span></p>
<p><span style="font-family: georgia;"><strong>Price - Visit <a href="https://www.healthsupplement24x7.com/order-brazilian-wood">Official Website</a></strong></span></p>
<h3 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/order-brazilian-wood"><span style="font-family: georgia;"><strong><span style="color: red;"><span style="background-color: #ffe599;">Get Huge Discount Now!!</span></span></strong></span></a></h3>
<h3 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/order-brazilian-wood"><strong><span style="font-family: georgia;"><span style="background-color: #fff2cc;"><span style="color: red;">Special Discount- As Low As On Brazilian Wood – Get Your Best Discount Online Hurry!!</span></span></span></strong></a></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/order-brazilian-wood"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEik33eCyIdmWplJnKJ87sXqpPQ8bNVXMQrcvdbG3qfdH5BRTXEhYaJkWM66Xa-PPp5vfpqErgAMFm3Bd5cqvTDEc6BHnPmgiQ6ixKuEJEyzTdODLMl3is2kADkTdpqqesvQbcpfCfs-6aOtl86fi3PDk-o7VmXRJLqBcTDI4_wsF68V6bxnaJAc31ZVxCyh/w640-h276/Brazilian%20Wood%20Male%20Enhancement%201.png" alt="" width="640" height="276" border="0" data-original-height="386" data-original-width="897" /></a></div>
<p>Various factors affect a person’s sex life. Such issues will interfere with a person’s sexual experiences in the long term and may cause a strain in their relationships.</p>
<p>To avoid issues related to male sexual performance, most people turn to sex enhancement supplements for help. One such supplement is <a href="https://bitbucket.org/brazilian-wood-male-enhancement/brazilian-wood-male-enhancement/issues/1/brazilian-wood-male-enhancement-lab-tested">Brazilian Wood Male Enhancement</a>.</p>
<p>There are many reasons why you may be diagnosed with low semen volume. Some of these reasons include weak pelvic muscles, low testosterone, retrograde ejaculation, or psychological problems.</p>
<p>We know that the quantity of semen and strength of release varies among men. Therefore, it is a great idea to visit health practitioners to understand the underlying causes of their sexual problems.</p>
<p>However, people still use enhancement supplements to treat symptoms of sexual dysfunction.</p>
<p>Brazilian Wood Male Enhancement is a common supplement that people use for treating problems related to men’s sexual health. However, it is best to know more about a product before using it, especially products as sensitive as supplements for men’s sexual health.</p>
<p>This article covers how <a href="https://www.dibiz.com/brazilianwoodsale">Brazilian Wood Male Enhancement</a> works, its ingredients, benefits, side effects and where you can purchase the supplement.</p>
<h2 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/order-brazilian-wood"><strong><span style="font-family: georgia;"><span style="background-color: #d9d2e9;"><span style="color: red;">SALE IS LIVE</span></span></span></strong></a></h2>
<h2 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/order-brazilian-wood"><strong><span style="font-family: georgia;"><span style="background-color: #ffe599;">Get <span style="color: red;">Brazilian Wood </span> “Now Available” Hurry Limited Time Offer Only For 1st User!!</span></span></strong></a></h2>
<h2 style="text-align: left;"><strong>What Is <a href="https://sites.google.com/view/brazilian-wood-male-enhan/">Brazilian Wood Male Enhancement</a>?</strong></h2>
<p><a href="https://www.podcasts.com/brazilian-wood-reviews">Brazilian Wood</a> is a natural male health enhancement supplement packed with nutrients that produce fast and safe results for men suffering from performance issues.</p>
<p>This capsular supplement contains several natural ingredients such as magnesium, horny goat weed and chrysin that have been clinically backed by several research trials conducted on animals and humans.</p>
<p>We know that most men do not like to talk about their problems when it comes to private health. However, ignoring your problems is no solution to them.</p>
<p>This is why created Brazilian Wood. As the label describes, this product is a “Peak Performance supplement,” helping men achieve peak performance.</p>
<p>Since its launch, the supplement has received several positive reviews, most of which state that it has helped change lives and bring back the spark the users thought was long lost.</p>
<p>This is what piqued our interest. As we went through the several Brazilian Wood reviews, we wondered what it really is that works so well in the formula or whether it is just empty hype.</p>
<p>This is why, in the article, we have decided to review Brazilian Wood, break down the product into multiple components for analysis and come to a conclusion.</p>
<h3 style="text-align: center;"><span style="background-color: #d9ead3;"><span style="color: red;"><span style="font-family: georgia;"><strong><a href="https://www.healthsupplement24x7.com/order-brazilian-wood">LIMITED TIME OFFER</a></strong></span></span></span></h3>
<h3 style="text-align: center;"><span style="background-color: #ffe599;"><span style="color: red;"><span style="font-family: georgia;"><strong><a href="https://www.healthsupplement24x7.com/order-brazilian-wood">Click Here to Order Brazilian Wood at Special Discounted Price</a></strong></span></span></span></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/order-brazilian-wood"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgB5C2Dtl5JiiRl9aCNlJnsAI8aV0csbCK71nIRYwkpHgXYYvWLfWlSXJkME1DpP7zNiFYcDo2h-tpcXxsY3i7xJNcYI0MeOS-b4C89q7sTG5qDmYvYRI_rIxR127vZwg0h_g5zFzM-hG5XUgthQnXRf0fJ00cpzN3YJPHxYwbOFAkZAnJd-dsikXQQzGiF/w585-h439/Brazilian%20Wood%20Male%20Enhancement%209.png" alt="" width="585" height="439" border="0" data-original-height="1050" data-original-width="1400" /></a></div>
<h2 style="text-align: left;"><strong>Core Mechanism – How Does <a href="https://www.eventcreate.com/e/brazilian-wood-male-enhancement">Brazilian Wood</a> Work?</strong></h2>
<p>Firstly, <a href="https://brazilian-wood-us-news.clubeo.com/calendar/2023/09/01/brazilian-wood-1-advanced-formula-for-enhanced-size-longer-endurance-larger-erection-work-or-hoax">Brazilian Wood</a> aims to improve blood flow and claims to boost nitric oxide production. This is an important aspect of sexual health, as nitric oxide is a signaling molecule that helps to regulate blood flow and other physiological processes.</p>
<p>By increasing the amount of nitric oxide in the body, <a href="https://brazilianwoodsale.cgsociety.org/2jwo/brazilian-wood-male-">Brazilian Wood</a> can help improve blood flow to other body parts, such as the heart and brain.</p>
<p>Finally, it includes ingredients that can boost testosterone levels. Testosterone is a hormone that plays a crucial role in male sexual health. It regulates drive, sperm production and muscle mass, among other things.</p>
<h3 style="text-align: left;"><strong style="font-family: georgia;">MUST SEE: <span style="background-color: #ffe599; color: red;"><a href="https://www.healthsupplement24x7.com/order-brazilian-wood">“Critical News Brazilian Wood Report – They Will Never Tell You This”</a></span></strong></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/order-brazilian-wood"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhkQeP19s_N2XdDGfM5ckT_J0924y_FpwpaslILv0ruw1mGZJ8lts-rVTuUH1GmF3UikhZwKRmnOB9rr7VzQnBUKj53WOiS_eQ5XQTwi1xp6p6BxMfA86hdljDQVCkxCNRzejmjUgR5nqrHMu_uU6-RkeaDDuVUzO0CoqZEmcQ1ehJEex-oOW86_vUdqMgF/w640-h224/Brazilian%20Wood%20Male%20Enhancement%202.jpg" alt="" width="640" height="224" border="0" data-original-height="523" data-original-width="1500" /></a></div>
<h2 style="text-align: left;"><strong>What Are The Natural Ingredients In The <a href="https://brazilian-wood-male-enhancement-supplement.mystrikingly.com/">Brazilian Wood</a> Formula?</strong></h2>
<p><strong><a href="https://soundcloud.com/brazilian-wood-male-enhancement-reviews/brazilian-wood-male-enhancement-lab-tested-enhancing-virility-boosting-stamina-and-performance">Brazilian Wood</a> is made with select natural ingredients clinically proven to promote female sexual health. Let’s take a look at some of these ingredients below:</strong></p>
<p><strong>Chrysin:</strong> Chrysin is a naturally occurring flavonoid found in several plants, including passionflower, honey and bee pollen. It is a yellow crystalline compound studied for potential health benefits, including its ability to promote overall sexual health in men.</p>
<p>One of the ways that chrysin works is by increasing the levels of nitric oxide in the body, which helps the blood vessels dilate, allowing for increased blood flow.</p>
<p>Another way that chrysin works to promote sexual health is by inhibiting the enzyme aromatase. Aromatase is responsible for converting testosterone into estrogen. By inhibiting aromatase, chrysin helps to increase the levels of free testosterone in the body.</p>
<p><strong>Epimedium:</strong> One of the core mechanisms of epimedium is its ability to increase levels of nitric oxide in the body. Nitric oxide is a molecule that helps dilate blood vessels and improve blood flow, which can help improve sexual function and promote overall male health.</p>
<p>This increased blood flow can also positively affect production, as it helps provide the necessary nutrients and oxygen to the testes.</p>
<p>A study conducted in 2017 found that epimedium extract was able to improve sperm motility and DNA integrity in male rats. This is particularly important for men trying to conceive, as healthy sperm motility and DNA integrity are crucial for successful fertilization.</p>
<p><strong>Saw Palmetto:</strong> Saw palmetto works in various ways to promote bigger erections and overall sexual health in men. One of the primary mechanisms is by blocking the enzyme 5-alpha-reductase. This enzyme is responsible for converting testosterone into dihydrotestosterone (DHT), which can contribute to prostate enlargement and other male health issues.</p>
<p>By blocking this enzyme, saw palmetto can help reduce DHT levels and promote prostate health.</p>
<p>Another way that saw palmetto promotes sexual health is by increasing the availability of free testosterone in the body. It does this by inhibiting the binding of testosterone to sex hormone-binding globulin (SHBG). This allows more testosterone for use by the body and can help improve sexual function.</p>
<p><strong>Tongkat Ali:</strong> Tongkat Ali contains bioactive compounds, including eurycomanone and quassinoids, which have been shown to have aphrodisiac properties that stimulate sexual desire and improve erectile function (ED). Furthermore, Tongkat Ali has been found to increase testosterone levels, a key hormone for male health and vitality.</p>
<p>A study published in the Journal of the International Society of Sports Nutrition investigated the effects of Tongkat Ali supplementation on male athletes’ hormonal profile and body composition. The study involved 14 male athletes who were given 100 mg of Tongkat Ali extract daily for five weeks.</p>
<p>The results showed that Tongkat Ali supplementation significantly increased testosterone levels by 37 percent. In addition, the participants had a significant increase in lean body mass and a significant decrease in fat mass.</p>
<p><strong>Winged Treebine:</strong> Winged treebine, also known as cayratia japonica, is a climbing vine plant native to Southeast Asia. Its leaves and stems contain a potent blend of natural compounds that work in various ways to enhance sexual performance and overall well-being.</p>
<p>Winged treebine contains plant sterols structurally similar to testosterone, which can help stimulate the body’s natural production of this hormone. This can lead to increased drive, improved sexual performance and greater overall energy and stamina.</p>
<h3 style="text-align: left;"><span style="background-color: #ffe599;"><span style="color: red;"><span style="font-family: georgia;"><strong><a href="https://www.healthsupplement24x7.com/order-brazilian-wood">To Learn More about Premium Brazilian Wood Ingredients in Detail, Click Here to Head to Its Official Website</a></strong></span></span></span></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/order-brazilian-wood"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgbZnP2YC0U3jPoTxLmG0RRVfz3CY9nOHF5saC3TybalnGOHs_kojzbicarDx-vyT8JrA7L96p6HmdXyR52AZeVs0jrdAPcElbu_G6l7yJMYWPTPp--pJXKFoqeVWDlZsYOvXthdI1Y_GbxMrww7i346BeBKidZHxdM7bl6-D7faYtj4L-9G9BUx_azKxU8/w640-h312/Brazilian%20Wood%20Male%20Enhancement%207.png" alt="" width="640" height="312" border="0" data-original-height="681" data-original-width="1400" /></a></div>
<h2 style="text-align: left;"><strong>Benefits of Using <a href="https://brazilian-wood-male-enhancement-supplement.jimdosite.com/">Brazilian Wood Male Enhancement</a><br /></strong></h2>
<p><strong>It Enhances Your Sex Life:</strong> <a href="https://brazilianwood-male-enhancement.hashnode.dev/brazilian-wood-male-enhancement-lab-tested-enhancing-virility-boosting-stamina-and-performancespam-or-legit">Brazilian Wood Male Enhancement</a> makes the climax stronger and more enjoyable. Most of the reviews this product gets on the brand’s official site have been positive, and across those reviews, improvement in sexual satisfaction has been a constant: men who use this pill notice more responsivity and longevity after their first month. Therefore, Brazilian Wood Male Enhancement helps men satisfy their partners more fully.</p>
<p><strong>It Improves Count and Quality:</strong> The Brazilian Wood Male Enhancement composition affects the bulbourethral gland, which manufactures a mucus-like fluid whose thick viscosity moisturizes the urethra. By acting on this gland, the supplement supports count and quality.</p>
<p><strong>Brazilian Wood Male Enhancement Increases the Volume In Men:</strong> You may have issues with volume or quantity, but don’t be too bothered; you are not alone. Every man has a prostate gland that grows as time goes on and this causes a reduction in volume.</p>
<p>The prostate gland secretes a liquid that makes up 25% of the fluid. The urethra runs through the prostate gland. Less fluid gets released when the prostate becomes engorged and squeezes the urethra.</p>
<p>To solve this reduction issue, Brazilian Wood Male Enhancement improves the health of the prostate leading to more volume.</p>
<p><strong>Higher libido:</strong> This supplement contains “horny goat weed,” popular for supporting libido and testosterone levels. It also improves blood circulation.</p>
<p><strong>Boosts Your Sexual Desires: </strong><a href="https://brazilian-wood-male-enhancement-reviews.bandcamp.com/track/brazilian-wood-male-enhancement-lab-tested-enhancing-virility-boosting-stamina-and-performance-spam-or-legit">Brazilian Wood Male Enhancement</a> can also be customized to meet your sexual health needs at a particular period.</p>
<h3 style="text-align: left;"><strong style="font-family: georgia;">Read This: <a href="https://www.healthsupplement24x7.com/order-brazilian-wood"><span style="background-color: #ffe599; color: red;">"More Information From Knowledgeable Expertise of Health Labs Brazilian Wood"</span></a></strong></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/order-brazilian-wood"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEilwc8GbuZYMSAn_2e8rH91PFPnbOGfCV-NgVGO8XZvPXX1pfqmX2M1DzxhSdZ3rqX3hXWO_nq91GZd-YUhbY9oSQ-xKJd1Oe634BGOjObPmza-2PUllL_Ysa5nCBY356fq3CSB-2VorsyvrFv4NSb97cfsZxdmSmhczFbzTfRvpYPYxkoh9bwyWqF-YqhY/w640-h222/Brazilian%20Wood%20Male%20Enhancement%203.jpg" alt="" width="640" height="222" border="0" data-original-height="519" data-original-width="1500" /></a></div>
<h2 style="text-align: left;"><strong>Are There Any Side Effects of Using <a href="https://brazilian-wood-male-enhancement-supplement.company.site/">Brazilian Wood Male Enhancement</a>?<br /></strong></h2>
<p>Users should be conscious of the drawbacks of <a href="https://www.podcasts.com/brazilian-wood-reviews/episode/brazilian-wood-lab-tested-enhancing-virility-boosting-stamina-and-performancespam-or-legit">Brazilian Wood Male Enhancement</a>, as this knowledge could significantly affect their experience. The product is easy to use, though mild discomfort and headache-like adverse effects may occur; these are common with male enhancement pills and are not severe.</p>
<p>The negative effects are modest and brief, and they disappear rapidly when the body adjusts to the supplement's components.</p>
<h3 style="text-align: left;"><span style="font-family: georgia;"><strong>IMPORTANT: <span style="background-color: #ffe599;"><span style="color: red;"><a href="https://www.healthsupplement24x7.com/order-brazilian-wood">Shocking Truth About Brazilian Wood – This May Change Your Mind!</a></span></span></strong></span></h3>
<h2 style="text-align: left;"><strong><a href="https://brazilian-wood-male-enhancement-supplement.webflow.io/">Brazilian Wood</a> Advantages <br /></strong></h2>
<p style="text-align: left;">• Increases sperm production and motility.</p>
<p style="text-align: left;">• Has advantages outside sexual health.</p>
<p style="text-align: left;">• Ingredients target erectile dysfunction.</p>
<p style="text-align: left;">• Quick and free delivery on all orders.</p>
<p style="text-align: left;">• Comes with a 100-day product guarantee.</p>
<h2 style="text-align: left;"><strong><a href="https://colab.research.google.com/drive/1hVH-FXkc8du2s_J9XMMpbkWJ6bRw6rJU?usp=sharing">Brazilian Wood</a> Disadvantages</strong></h2>
<p style="text-align: left;">• Several components lack substantial research.</p>
<p style="text-align: left;">• Maximum outcomes could take as long as 90 days.</p>
<p style="text-align: left;">• May result in an elevated heart rate.</p>
<h3 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/order-brazilian-wood"><strong style="background-color: #ffe599; color: red; font-family: georgia;">Save big on Brazilian Wood – shop now while it's on sale!</strong></a></h3>
<h3 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/order-brazilian-wood"><strong style="background-color: #ffe599; color: red; font-family: georgia;">Place your order today before stock runs out!</strong></a></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/order-brazilian-wood"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgC1Kubcp0KeyBXob0bp8Z_QFtp3jkLU8p7xSTE3f6uGT_W-_inIoDZzOa1G5a6U14P0DF0C-y4wbSHomEfBC0C9vsMjS2B3GzgOJ7Pq1_jN0Ps6t4MMFjn5aPMLWcy3oCW_LYlPVVpdFTU2eAJcEHo2kxskwhDoDZ9xVH2l8EXPiL0PT64S8pMNdJRxVHm/w640-h224/Brazilian%20Wood%20Male%20Enhancement%205.jpg" alt="" width="640" height="224" border="0" data-original-height="475" data-original-width="1359" /></a></div>
<h2 style="text-align: left;"><strong>Dosage and Tips to Use <a href="https://lookerstudio.google.com/reporting/560ee9dd-b2e9-49ff-9563-9042a549253a">Brazilian Wood Male Enhancement</a><br /></strong></h2>
<p>A bottle of <a href="https://animale-cannabinoids-news.clubeo.com/calendar/2023/09/01/brazilian-wood-100-increase-pleasure-reduce-anxiety-boost-moods-length-girth-real-or-hoax">Brazilian Wood Male Enhancement</a> comes with 120 tablets. For optimum effectiveness, you should take the pills at least 10 minutes before eating your meal. For starters, tablets should be taken 2 to 3 times daily or as prescribed by your physician. You can take the supplement every day, or for the duration your physician specifies.</p>
<p>Note that the recommended dosage also depends on your body weight. Do not take more than one pill at a time. Furthermore, exercise regularly and ensure you eat well.</p>
<h3 style="text-align: left;"><span style="font-family: georgia;"><strong>READ ALSO: <a href="https://www.healthsupplement24x7.com/order-brazilian-wood"><span style="background-color: #ffe599;"><span style="color: red;">Does the Brazilian Wood Work For Everyone? Before you buy, read real customer reviews and testimonials!!</span></span></a></strong></span></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/order-brazilian-wood"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg7nl9CQyXGDO2mL6Ram6R2WD_i0i-tYhnEAk9o-mkRct2BRfuMqtpAyOZGpdDVv8TN-PUiRMSEb2a9L4Sep6wlD02G6GTK2w348lo_oopi2wRcoIkVyGcnUrsrtogljkwVgqbV2thKvy3EiGrsoebd0c8R07xrkXAgrwyVl7XV15rPmf3SpcmkDlepH6LN/w640-h282/Brazilian%20Wood%20Male%20Enhancement%206.jpg" alt="" width="640" height="282" border="0" data-original-height="697" data-original-width="1585" /></a></div>
<h2 style="text-align: left;"><strong><a href="https://sway.office.com/wi9XATF8g6U9xBjE?ref=Link&loc=mysways">Brazilian Wood Male Enhancement</a> Pricing & Guarantee</strong></h2>
<p><a href="https://haitiliberte.com/advert/brazilian-wood-male-enhancement-lab-tested-enhancing-virility-boosting-stamina-and-performancespam-or-legit/">Brazilian Wood Male Enhancement</a> is one of the best natural male enhancement supplements for vitality and libido. If you’re ready to try Brazilian Wood Male Enhancement for yourself, then the best place to order is directly through the official website.</p>
<p>No matter what package you select, you are covered by the manufacturer’s 60-day money back guarantee. If for any reason you are dissatisfied with your experience while using Brazilian Wood Male Enhancement or don’t like the results, then you can receive a refund within 60 days of purchasing the product. Simply contact the manufacturer and you’ll receive a full refund – no questions asked.</p>
<p><strong>These are the <a href="https://devfolio.co/projects/brazilian-wood-boosting-stamina-and-performance-d35e">Brazilian Wood</a> costs which decline while getting more units simultaneously:</strong></p>
<p><strong>Basic -</strong> 1 Bottle Supply of Brazilian Wood USD 69/bottle + SMALL SHIPPING.</p>
<p><strong>Popular Pack -</strong> Buy 3 Bottle Supply of Brazilian Wood USD 59/bottle + FREE SHIPPING.</p>
<p><strong>Best Value Pack - </strong>Buy 6 Bottle Supply of Brazilian Wood USD 49/bottle + FREE SHIPPING.</p>
<p style="text-align: left;"><a href="https://www.yepdesk.com/brazilian-wood-male-enhancement-lab-tested-enhancing-virility-boosting-stamina-and-performance">Brazilian Wood</a> Payments are made using 256-bit SSL technology to keep information safe and secure, and all orders arrive within a few business days of ordering.</p>
<h3 style="text-align: left;"><strong style="font-family: georgia;">Special Offer: <span style="background-color: #fff2cc; color: red;"><a href="https://www.healthsupplement24x7.com/order-brazilian-wood">Click Here To Get Heavy Discount Instantly!!</a></span></strong></h3>
<p><span style="font-family: times;"><span style="font-size: medium;"><span style="color: red;">Good News: Get additional discount on shipping when you checkout with Mastercard or Discover card!</span></span></span></p>
<div class="separator" style="clear: both; text-align: center;">
<p style="text-align: left;"><span style="font-size: medium;"><a style="clear: left; float: left; margin-bottom: 1em; margin-left: 1em;" href="https://www.healthsupplement24x7.com/order-brazilian-wood"><img src="https://blogger.googleusercontent.com/img/a/AVvXsEgJqDXBj2s2sKgxhjLGKnDNPxD392fUjUkF8lQbqbuoFZwPHnPE27muXA18Hs1EzbsUHHsPlOR9Njx119fwMPFiCrLv9NlRRfEUdLPeIVlqZmqjexv1dJ0pMoSO6VUtSY89rewM_LiPyGpkGpNCHHdprDSvrWyt6MprtcceNFal6bdDPK_FyvLHnQzy-A" alt="" width="110" height="120" border="0" data-original-height="120" data-original-width="110" /></a><span style="font-family: helvetica;"><span style="font-size: small;"><strong><span style="color: red;">APPROVED!</span><br /></strong></span></span></span></p>
<p style="text-align: left;"><span style="font-family: helvetica;"><span style="font-size: small;">Limited supply available. We currently have product in stock and ready to ship within <span style="color: red;">24 hours</span>.</span></span></p>
</div>
<p><span style="font-family: helvetica;"><span style="font-size: small;"><strong><span style="color: red;">EXPIRE SOON</span></strong></span></span></p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/order-brazilian-wood"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiisYiJ0ElJAxCDaw8s4ZC7XLKo7pfLfu9kXCJllEQS4x2lV9r9L0ObMgdrv9qp17IFZJWgUUTORVfpol_8CzVF6bDXRdwZTnOExAiWRMlSyIBeB7XqiWql1CGXvphuCEJ5qhOes7sDHA8PNSZ9qxlgkRxvPdnpIzXpJA5hG_ThBRE8peep8Twv5Wg809Kn/w484-h484/Brazilian%20Wood%20Male%20Enhancement%2012.png" alt="" width="484" height="484" border="0" data-original-height="1400" data-original-width="1400" /></a></div>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/order-brazilian-wood"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhSJvORHtAeEI3H2rypjo7v70Cm2j2tC1B-Ja0K1qVp1MEYhmISktm3oeSPvmtOjcgIp6VWYex2WQ2w6gsXFZPdis4AmxfwRGftHtwSK5PNs5-vJjhVZwsNY6SljpUWbanRSWMbVUibr78lOgAkjowIEGQGH8g4my7mrAF8bND5KSQ7K8qU9d1qadr8WA/w327-h97/btn.png" alt="" width="327" height="97" border="0" data-original-height="84" data-original-width="282" /></a></div>
<p style="text-align: center;">By submitting, you affirm to have read and agreed to our <a href="https://www.healthsupplement24x7.com/order-brazilian-wood"><span style="color: red;">Terms & Conditions</span></a>.</p>
<p style="text-align: center;"><a href="https://www.healthsupplement24x7.com/order-brazilian-wood"><span style="font-size: medium;"><span style="background-color: #ffe599;"><span style="color: red;"><span style="font-family: georgia;"><strong>HUGE SAVINGS Get Your Brazilian Wood “Get Something OFF” Get 2+1 Offer Hurry Only For 1st User!!</strong></span></span></span></span></a></p>
<h2 style="text-align: left;"><strong>Final Verdict</strong></h2>
<p>To conclude this review: the supplement is 100 percent effective and safe to use. The Brazilian Wood reviews have already demonstrated that the product is worth the cost.</p>
<p>On top of this, it is backed by a solid money-back guarantee, so you do not have to worry about losing your money either!</p>
<p class="ql-align-center" style="text-align: center;"><a href="https://www.healthsupplement24x7.com/order-brazilian-wood"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgHYxn3NMPQqsmKG54OjkQESM8dw8D7zUXtssdLHaaWSYArzmNucZfEfKCOBsnUqZdp6i-enO0zDWtMGF2pKG2MifoTldIDExJOBDWxicPkSeox29VCmqX6Cz2feNaSfYBnC_BHUdfPT1qUGVgSNyn0NtyKxY-V-M-BDbo5jCOW4qSuxwu3TOTA3dSjIQ/s1600/Screenshot%20(1445).png" alt="" width="320" height="114" /></a></p>
<p class="ql-align-center" style="text-align: center;"><span style="font-family: georgia;"><strong><a href="https://www.healthsupplement24x7.com/order-brazilian-wood">Terms and Conditions</a></strong><strong> | </strong><a href="https://www.healthsupplement24x7.com/order-brazilian-wood"><strong>Privacy</strong></a><strong> | </strong><a href="https://www.healthsupplement24x7.com/order-brazilian-wood"><strong>Contact Us</strong></a></span></p>
<p class="ql-align-center" style="text-align: center;"><span style="font-family: georgia;"><strong>© 2023 <a href="https://www.healthsupplement24x7.com/order-brazilian-wood">Brazilian Wood</a></strong><strong>. All Rights Reserved.</strong></span></p> |
piyush23111991/CovidTrialData | 2023-09-02T18:36:03.000Z | [
"region:us"
] | piyush23111991 | null | null | null | 0 | 0 | Entry not found |
jxie/qg-tagging-discrete | 2023-09-02T19:24:00.000Z | [
"region:us"
] | jxie | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 88455264
num_examples: 1600000
- name: val
num_bytes: 11065570
num_examples: 200000
- name: test
num_bytes: 11058867
num_examples: 200000
download_size: 53537602
dataset_size: 110579701
---
# Dataset Card for "qg-tagging-discrete"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tzzey/guanaco-llama2-1k | 2023-09-02T19:28:03.000Z | [
"region:us"
] | Tzzey | null | null | null | 0 | 0 | Entry not found |
prakhargupta94/recipe | 2023-09-02T19:44:10.000Z | [
"region:us"
] | prakhargupta94 | null | null | null | 0 | 0 | |
E1010836/cegid | 2023-09-05T06:07:41.000Z | [
"license:openrail",
"region:us"
] | E1010836 | null | null | null | 0 | 0 | ---
license: openrail
---
#23/09/02 20:50
Test dataset files created based on the questions from Vianney.
Only two questions.
Train = Test.
|
Mutegoma3/prompt_to_video_generator | 2023-09-02T20:13:59.000Z | [
"region:us"
] | Mutegoma3 | null | null | null | 0 | 0 | Entry not found |
zatepyakin/coyo_3m_hd | 2023-09-04T12:39:12.000Z | [
"license:unknown",
"region:us"
] | zatepyakin | null | null | null | 0 | 0 | ---
license: unknown
---
|
davidscripka/openwakeword_features | 2023-09-04T01:51:44.000Z | [
"license:cc-by-nc-sa-4.0",
"region:us"
] | davidscripka | null | null | null | 0 | 0 | ---
license: cc-by-nc-sa-4.0
---
This dataset contains precomputed audio features designed for use with the [openWakeWord library](https://github.com/dscripka/openWakeWord).
Specifically, they are intended to be used as general purpose negative data (that is, data that does *not* contain the target wake word/phrase) for training custom openWakeWord models.
The individual .npy files in this dataset are not original audio data, but rather are low dimensional audio features produced by a pre-trained [speech embedding model from Google](https://tfhub.dev/google/speech_embedding/1).
openWakeWord uses these features as inputs to custom word/phrase detection models.
The dataset currently contains precomputed features from the following datasets.
## ACAV100M
The ACAV100M dataset contains a highly diverse set of audio data, with multilingual speech, noise, and music, all captured in real-world environments.
This makes it a highly effective dataset for training custom openWakeWord models.
**Dataset source**: https://acav100m.github.io/
**Size**: An array of shape (5625000, 16, 96), corresponding to ~2000 hours of audio from the ACAV100M dataset. Each row in the array has a temporal dimension of 16, which at 80 ms per temporal step results in each row containing features representing 1.28 seconds of audio.
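As a rough sketch of how these dimensions relate (the small array below is a stand-in; only the documented `(rows, 16, 96)` layout, the 80 ms step, and the 5,625,000-row total are taken from this card, and the `.npy` file name is an assumption):

```python
import numpy as np

# Stand-in for a loaded feature file, e.g. features = np.load("acav100m_features.npy")
# (the actual file name is not specified in this card).
features = np.zeros((1000, 16, 96), dtype=np.float32)

SECONDS_PER_STEP = 0.080  # 80 ms per temporal step
seconds_per_row = features.shape[1] * SECONDS_PER_STEP  # 16 steps -> 1.28 s

# Sanity check against the documented totals for the full array:
full_rows = 5_625_000
total_hours = full_rows * seconds_per_row / 3600  # -> 2000.0 hours
print(seconds_per_row, total_hours)
```

This is just the arithmetic behind the "~2000 hours" figure: 16 steps × 80 ms = 1.28 s per row, and 5,625,000 rows × 1.28 s ≈ 2000 hours.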
## False-Positive Validation Set
This is a hand-selected combination of audio features (representing ~11 hours of total audio) that serves as a false-positive validation set when training custom openWakeWord models.
It is intended to be broadly representative of the different types of environments where openWakeWord models could be deployed, and thus useful for estimating false-positive rates.
The contributing audio datasets are:
1) The entire [DiPCo](https://www.amazon.science/publications/dipco-dinner-party-corpus) dataset (~5.3 hours)
2) Selected clips from the [Santa Barbara Corpus of Spoken American English](https://www.linguistics.ucsb.edu/research/santa-barbara-corpus) (~3.7 hours)
3) Selected clips from the [MUSDB Music Dataset](https://sigsep.github.io/datasets/musdb.html) (2 hours)
Note that the MUSDB audio data was first reverberated with the [MIT impulse response recordings](https://huggingface.co/datasets/davidscripka/MIT_environmental_impulse_responses) to make it more representative of real-world deployments.
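The reverberation step noted above can be sketched as a simple convolution of a dry clip with a recorded impulse response; the arrays below are synthetic stand-ins, since no audio file paths are given in this card:

```python
import numpy as np

sr = 16000  # sample rate for the sketch (assumption)
rng = np.random.default_rng(0)

dry = rng.standard_normal(sr)  # 1 s of "dry" audio (stand-in for a MUSDB clip)
# Decaying-noise stand-in for a real MIT room impulse response recording:
rir = np.exp(-np.linspace(0.0, 8.0, sr // 4)) * rng.standard_normal(sr // 4)

wet = np.convolve(dry, rir)[: len(dry)]   # reverberant version, trimmed to input length
wet = wet / (np.max(np.abs(wet)) + 1e-9)  # peak-normalize to [-1, 1]
```

In practice one would load real clips and real impulse responses in place of the synthetic arrays, but the convolve-and-normalize shape of the operation is the same.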
|
gaodrew/thera-250 | 2023-09-02T20:54:01.000Z | [
"license:other",
"region:us"
] | gaodrew | null | null | null | 0 | 0 | ---
license: other
---
|
Tzzey/myabacm_dataset_2 | 2023-09-02T21:00:52.000Z | [
"region:us"
] | Tzzey | null | null | null | 0 | 0 | |
EtherLLM/shitcoins | 2023-09-02T21:17:03.000Z | [
"license:mit",
"region:us"
] | EtherLLM | null | null | null | 0 | 0 | ---
license: mit
---
A dataset of blockchain transactions, intended for finding an appropriate encoder to transform the data into a relational vector database. |
open-llm-leaderboard/details_RWKV__rwkv-4-169m-pile | 2023-09-02T21:19:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of RWKV/rwkv-4-169m-pile
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RWKV/rwkv-4-169m-pile](https://huggingface.co/RWKV/rwkv-4-169m-pile) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RWKV__rwkv-4-169m-pile\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-02T21:18:03.608876](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-4-169m-pile/blob/main/results_2023-09-02T21%3A18%3A03.608876.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23206717779532135,\n\
\ \"acc_stderr\": 0.030752963729471144,\n \"acc_norm\": 0.23329105066054212,\n\
\ \"acc_norm_stderr\": 0.030770587577329072,\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602573,\n \"mc2\": 0.41923236334461716,\n\
\ \"mc2_stderr\": 0.014859398619089813\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19112627986348124,\n \"acc_stderr\": 0.011490055292778585,\n\
\ \"acc_norm\": 0.2363481228668942,\n \"acc_norm_stderr\": 0.01241496052430183\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.29038040231029677,\n\
\ \"acc_stderr\": 0.004530101869973212,\n \"acc_norm\": 0.31736705835490936,\n\
\ \"acc_norm_stderr\": 0.0046450036620678875\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.21481481481481482,\n\
\ \"acc_stderr\": 0.035478541985608264,\n \"acc_norm\": 0.21481481481481482,\n\
\ \"acc_norm_stderr\": 0.035478541985608264\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.03355045304882921,\n\
\ \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.03355045304882921\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.20754716981132076,\n \"acc_stderr\": 0.02495991802891127,\n\
\ \"acc_norm\": 0.20754716981132076,\n \"acc_norm_stderr\": 0.02495991802891127\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\
\ \"acc_stderr\": 0.03396116205845335,\n \"acc_norm\": 0.20833333333333334,\n\
\ \"acc_norm_stderr\": 0.03396116205845335\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.042207736591714534,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.042207736591714534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.02880998985410297,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.02880998985410297\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.0409698513984367,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.0409698513984367\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924812,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924812\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948365,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948365\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.037649508797906066,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.037649508797906066\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.18387096774193548,\n\
\ \"acc_stderr\": 0.02203721734026784,\n \"acc_norm\": 0.18387096774193548,\n\
\ \"acc_norm_stderr\": 0.02203721734026784\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.1921182266009852,\n \"acc_stderr\": 0.027719315709614775,\n\
\ \"acc_norm\": 0.1921182266009852,\n \"acc_norm_stderr\": 0.027719315709614775\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565319,\n\
\ \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565319\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20512820512820512,\n \"acc_stderr\": 0.020473233173551986,\n\
\ \"acc_norm\": 0.20512820512820512,\n \"acc_norm_stderr\": 0.020473233173551986\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.21481481481481482,\n \"acc_stderr\": 0.025040443877000686,\n \
\ \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.025040443877000686\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1944954128440367,\n \"acc_stderr\": 0.016970289090458047,\n \"\
acc_norm\": 0.1944954128440367,\n \"acc_norm_stderr\": 0.016970289090458047\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.12962962962962962,\n \"acc_stderr\": 0.02290788315128861,\n \"\
acc_norm\": 0.12962962962962962,\n \"acc_norm_stderr\": 0.02290788315128861\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n\
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.27802690582959644,\n\
\ \"acc_stderr\": 0.030069584874494043,\n \"acc_norm\": 0.27802690582959644,\n\
\ \"acc_norm_stderr\": 0.030069584874494043\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596918,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596918\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1650485436893204,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.1650485436893204,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.029343114798094486,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.029343114798094486\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23243933588761176,\n\
\ \"acc_stderr\": 0.0151045500089057,\n \"acc_norm\": 0.23243933588761176,\n\
\ \"acc_norm_stderr\": 0.0151045500089057\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\
\ \"acc_stderr\": 0.014400296429225627,\n \"acc_norm\": 0.24581005586592178,\n\
\ \"acc_norm_stderr\": 0.014400296429225627\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.023550831351995094,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.023550831351995094\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.20987654320987653,\n \"acc_stderr\": 0.02265834408598136,\n\
\ \"acc_norm\": 0.20987654320987653,\n \"acc_norm_stderr\": 0.02265834408598136\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2872340425531915,\n \"acc_stderr\": 0.026992199173064352,\n \
\ \"acc_norm\": 0.2872340425531915,\n \"acc_norm_stderr\": 0.026992199173064352\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.024562204314142314,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.024562204314142314\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24183006535947713,\n \"acc_stderr\": 0.017322789207784326,\n \
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.017322789207784326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368466,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368466\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.22289156626506024,\n\
\ \"acc_stderr\": 0.03240004825594688,\n \"acc_norm\": 0.22289156626506024,\n\
\ \"acc_norm_stderr\": 0.03240004825594688\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602573,\n \"mc2\": 0.41923236334461716,\n\
\ \"mc2_stderr\": 0.014859398619089813\n }\n}\n```"
repo_url: https://huggingface.co/RWKV/rwkv-4-169m-pile
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|arc:challenge|25_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hellaswag|10_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T21:18:03.608876.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T21:18:03.608876.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T21:18:03.608876.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T21:18:03.608876.parquet'
- config_name: results
data_files:
- split: 2023_09_02T21_18_03.608876
path:
- results_2023-09-02T21:18:03.608876.parquet
- split: latest
path:
- results_2023-09-02T21:18:03.608876.parquet
---
# Dataset Card for Evaluation run of RWKV/rwkv-4-169m-pile
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/RWKV/rwkv-4-169m-pile
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [RWKV/rwkv-4-169m-pile](https://huggingface.co/RWKV/rwkv-4-169m-pile) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RWKV__rwkv-4-169m-pile",
	"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-02T21:18:03.608876](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-4-169m-pile/blob/main/results_2023-09-02T21%3A18%3A03.608876.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task's results in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.23206717779532135,
"acc_stderr": 0.030752963729471144,
"acc_norm": 0.23329105066054212,
"acc_norm_stderr": 0.030770587577329072,
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602573,
"mc2": 0.41923236334461716,
"mc2_stderr": 0.014859398619089813
},
"harness|arc:challenge|25": {
"acc": 0.19112627986348124,
"acc_stderr": 0.011490055292778585,
"acc_norm": 0.2363481228668942,
"acc_norm_stderr": 0.01241496052430183
},
"harness|hellaswag|10": {
"acc": 0.29038040231029677,
"acc_stderr": 0.004530101869973212,
"acc_norm": 0.31736705835490936,
"acc_norm_stderr": 0.0046450036620678875
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.035478541985608264,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.035478541985608264
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21710526315789475,
"acc_stderr": 0.03355045304882921,
"acc_norm": 0.21710526315789475,
"acc_norm_stderr": 0.03355045304882921
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.20754716981132076,
"acc_stderr": 0.02495991802891127,
"acc_norm": 0.20754716981132076,
"acc_norm_stderr": 0.02495991802891127
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.03396116205845335,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.03396116205845335
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.03063114553919882,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.03063114553919882
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.042207736591714534,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.042207736591714534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.02880998985410297,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.02880998985410297
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.0409698513984367,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.0409698513984367
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924812,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924812
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948365,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948365
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.037649508797906066,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.037649508797906066
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18387096774193548,
"acc_stderr": 0.02203721734026784,
"acc_norm": 0.18387096774193548,
"acc_norm_stderr": 0.02203721734026784
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1921182266009852,
"acc_stderr": 0.027719315709614775,
"acc_norm": 0.1921182266009852,
"acc_norm_stderr": 0.027719315709614775
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2538860103626943,
"acc_stderr": 0.03141024780565319,
"acc_norm": 0.2538860103626943,
"acc_norm_stderr": 0.03141024780565319
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.020473233173551986,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.020473233173551986
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.025040443877000686,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.025040443877000686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1944954128440367,
"acc_stderr": 0.016970289090458047,
"acc_norm": 0.1944954128440367,
"acc_norm_stderr": 0.016970289090458047
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.12962962962962962,
"acc_stderr": 0.02290788315128861,
"acc_norm": 0.12962962962962962,
"acc_norm_stderr": 0.02290788315128861
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.27802690582959644,
"acc_stderr": 0.030069584874494043,
"acc_norm": 0.27802690582959644,
"acc_norm_stderr": 0.030069584874494043
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596918,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596918
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.1650485436893204,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.1650485436893204,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.029343114798094486,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.029343114798094486
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23243933588761176,
"acc_stderr": 0.0151045500089057,
"acc_norm": 0.23243933588761176,
"acc_norm_stderr": 0.0151045500089057
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225627,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225627
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.023550831351995094,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.023550831351995094
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.20987654320987653,
"acc_stderr": 0.02265834408598136,
"acc_norm": 0.20987654320987653,
"acc_norm_stderr": 0.02265834408598136
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2872340425531915,
"acc_stderr": 0.026992199173064352,
"acc_norm": 0.2872340425531915,
"acc_norm_stderr": 0.026992199173064352
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.024562204314142314,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.024562204314142314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.017322789207784326,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.017322789207784326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368466,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368466
},
"harness|hendrycksTest-virology|5": {
"acc": 0.22289156626506024,
"acc_stderr": 0.03240004825594688,
"acc_norm": 0.22289156626506024,
"acc_norm_stderr": 0.03240004825594688
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602573,
"mc2": 0.41923236334461716,
"mc2_stderr": 0.014859398619089813
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DigitalAiWolf/Zabivaka | 2023-09-02T21:27:52.000Z | [
"region:us"
] | DigitalAiWolf | null | null | null | 0 | 0 | Entry not found |
Ketro/TRIAL | 2023-09-02T21:32:20.000Z | [
"region:us"
] | Ketro | null | null | null | 0 | 0 | Entry not found |
judy93536/significant_titles | 2023-09-02T21:57:51.000Z | [
"region:us"
] | judy93536 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 435957
num_examples: 5229
download_size: 0
dataset_size: 435957
---
# Dataset Card for "significant_titles"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ImagenHub/Multi_Subject_Driven_Image_Generation | 2023-10-05T18:28:31.000Z | [
"arxiv:2310.01596",
"region:us"
] | ImagenHub | null | null | null | 1 | 0 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: concept1
dtype: string
- name: concept2
dtype: string
- name: uid
dtype: int64
splits:
- name: train
num_bytes: 7408
num_examples: 102
download_size: 4243
dataset_size: 7408
---
# Dataset Card
Dataset in [ImagenHub](arxiv.org/abs/2310.01596).
# Citation
Please kindly cite our paper if you use our code, data, models or results:
```
@article{ku2023imagenhub,
title={ImagenHub: Standardizing the evaluation of conditional image generation models},
author={Max Ku and Tianle Li and Kai Zhang and Yujie Lu and Xingyu Fu and Wenwen Zhuang and Wenhu Chen},
journal={arXiv preprint arXiv:2310.01596},
year={2023}
}
``` |
JihyukKim/eli5-subquestion-d3-paired | 2023-09-02T22:13:26.000Z | [
"region:us"
] | JihyukKim | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: qid
dtype: string
- name: question
dtype: string
- name: response_j
dtype: string
- name: response_k
dtype: string
- name: gold_claims
sequence: string
- name: response_j_claims
sequence: string
- name: response_k_claims
sequence: string
splits:
- name: train
num_bytes: 4137966
num_examples: 3231
- name: test
num_bytes: 61057
num_examples: 49
download_size: 938797
dataset_size: 4199023
---
# Dataset Card for "eli5-subquestion-d3-paired"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JihyukKim/eli5-subquestion-d3-paired-sft | 2023-09-02T22:16:52.000Z | [
"region:us"
] | JihyukKim | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: qid
dtype: string
- name: question
dtype: string
- name: response_j
dtype: string
- name: response_k
dtype: string
- name: gold_claims
sequence: string
- name: response_j_claims
sequence: string
- name: response_k_claims
sequence: string
splits:
- name: train
num_bytes: 2066181
num_examples: 1638
- name: test
num_bytes: 32902
num_examples: 27
download_size: 753524
dataset_size: 2099083
---
# Dataset Card for "eli5-subquestion-d3-paired-sft"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_house_16H_gosdt_l512_d3_sd1 | 2023-09-02T22:18:28.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 9224800000
num_examples: 100000
- name: validation
num_bytes: 922480000
num_examples: 10000
download_size: 3196694951
dataset_size: 10147280000
---
# Dataset Card for "autotree_automl_house_16H_gosdt_l512_d3_sd1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thagmrs/costumer_qa | 2023-09-02T23:39:26.000Z | [
"license:unknown",
"region:us"
] | thagmrs | null | null | null | 0 | 0 | ---
license: unknown
---
|
Dippi9845/interval_tree_arxiv_long | 2023-09-02T23:09:32.000Z | [
"license:cc-by-nc-nd-4.0",
"region:us"
] | Dippi9845 | null | null | null | 0 | 0 | ---
license: cc-by-nc-nd-4.0
---
|
aquillesdaamizade/pony-head | 2023-09-02T23:14:41.000Z | [
"license:other",
"region:us"
] | aquillesdaamizade | null | null | null | 0 | 0 | ---
license: other
---
|
DigitalAiWolf/LeoSaitama | 2023-09-02T23:19:15.000Z | [
"region:us"
] | DigitalAiWolf | null | null | null | 0 | 0 | Entry not found |
ahmadandzohreh1/brain | 2023-09-03T00:25:44.000Z | [
"region:us"
] | ahmadandzohreh1 | null | null | null | 0 | 0 | Entry not found |
Roscall/Elvis50v2 | 2023-09-03T00:37:04.000Z | [
"region:us"
] | Roscall | null | null | null | 0 | 0 | Entry not found |
liyucheng/novel_metaphor | 2023-09-03T02:47:37.000Z | [
"region:us"
] | liyucheng | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: words
sequence: string
- name: lemmas
sequence: string
- name: poses
sequence: string
- name: metaphor_classes
sequence: int64
- name: novel_score
sequence: float64
splits:
- name: train
num_bytes: 17600252
num_examples: 32036
download_size: 3437305
dataset_size: 17600252
---
# Dataset Card for "novel_metaphor"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hammerjohn/small-twitter-financial-news-topic | 2023-09-03T02:35:01.000Z | [
"region:us"
] | hammerjohn | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 113985
num_examples: 800
- name: validation
num_bytes: 29748.311877580763
num_examples: 200
download_size: 100302
dataset_size: 143733.31187758077
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "small-twitter-financial-news-topic"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cmsolson75/artist_song_lyric_dataset | 2023-09-03T01:43:51.000Z | [
"license:apache-2.0",
"region:us"
] | cmsolson75 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Admin08077/U2 | 2023-09-03T07:41:40.000Z | [
"doi:10.57967/hf/1067",
"region:us"
] | Admin08077 | null | null | null | 0 | 0 | dataset: *.* |
Ailyth/Playstation4GamePatchData | 2023-10-03T15:26:39.000Z | [
"license:mit",
"region:us"
] | Ailyth | null | null | null | 0 | 0 | ---
license: mit
---
This dataset records the update information of all PS4 games, including CUSA ID, game title, and the latest version. Games that have never released patches (i.e., version 1.0) are not included in this dataset. |
anhtu12st/papers | 2023-09-03T02:13:23.000Z | [
"region:us"
] | anhtu12st | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 45435
num_examples: 155
download_size: 28085
dataset_size: 45435
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "papers"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
something009/nekoland | 2023-09-03T02:17:28.000Z | [
"region:us"
] | something009 | null | null | null | 0 | 0 | Entry not found |
sarthakpadhi2016/code-llama-spider-1k | 2023-09-03T02:28:02.000Z | [
"region:us"
] | sarthakpadhi2016 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1902503
num_examples: 1000
download_size: 514871
dataset_size: 1902503
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code-llama-spider-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DigitalAiWolf/Werehog | 2023-09-03T02:34:36.000Z | [
"region:us"
] | DigitalAiWolf | null | null | null | 0 | 0 | Entry not found |
clara659/wordhu | 2023-09-03T03:02:47.000Z | [
"region:us"
] | clara659 | null | null | null | 0 | 0 | Entry not found |
wordhu/movhu | 2023-09-03T03:45:03.000Z | [
"region:us"
] | wordhu | null | null | null | 0 | 0 | Entry not found |
thanhnew2001/funny | 2023-09-03T04:30:19.000Z | [
"region:us"
] | thanhnew2001 | null | null | null | 0 | 0 | Entry not found |
BigSuperbPrivate/DialogueActPairing_DailyTalk | 2023-09-04T13:49:33.000Z | [
"region:us"
] | BigSuperbPrivate | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: file2
dtype: string
- name: audio2
dtype: audio
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 5775513985.0
num_examples: 10000
- name: validation
num_bytes: 1225208055.0
num_examples: 2000
download_size: 6514124261
dataset_size: 7000722040.0
---
# Dataset Card for "DialogueActPairing_DailyTalk"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_xxyyy123__test_qkvo_adptor | 2023-09-03T04:30:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xxyyy123/test_qkvo_adptor
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xxyyy123/test_qkvo_adptor](https://huggingface.co/xxyyy123/test_qkvo_adptor)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__test_qkvo_adptor\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T04:28:57.572800](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__test_qkvo_adptor/blob/main/results_2023-09-03T04%3A28%3A57.572800.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5181817055404072,\n\
\ \"acc_stderr\": 0.03503876216805227,\n \"acc_norm\": 0.5217050459960606,\n\
\ \"acc_norm_stderr\": 0.0350240051469643,\n \"mc1\": 0.37576499388004897,\n\
\ \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5352609389838828,\n\
\ \"mc2_stderr\": 0.015584516858746202\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5349829351535836,\n \"acc_stderr\": 0.014575583922019669,\n\
\ \"acc_norm\": 0.5537542662116041,\n \"acc_norm_stderr\": 0.014526705548539982\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6007767377016531,\n\
\ \"acc_stderr\": 0.004887378682406531,\n \"acc_norm\": 0.7898824935271859,\n\
\ \"acc_norm_stderr\": 0.004065592811695945\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.0403356566784832,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.0403356566784832\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.041711158581816184,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.041711158581816184\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.4682080924855491,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5645161290322581,\n\
\ \"acc_stderr\": 0.028206225591502744,\n \"acc_norm\": 0.5645161290322581,\n\
\ \"acc_norm_stderr\": 0.028206225591502744\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.03366124489051449,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.03366124489051449\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.035243908445117815,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.035243908445117815\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6363636363636364,\n \"acc_stderr\": 0.03427308652999933,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03427308652999933\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845454,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845454\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5282051282051282,\n \"acc_stderr\": 0.02531063925493388,\n \
\ \"acc_norm\": 0.5282051282051282,\n \"acc_norm_stderr\": 0.02531063925493388\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.032437180551374116,\n\
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.032437180551374116\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"\
acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7211009174311926,\n \"acc_stderr\": 0.019227468876463507,\n \"\
acc_norm\": 0.7211009174311926,\n \"acc_norm_stderr\": 0.019227468876463507\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373617,\n \"\
acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373617\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.029571601065753374,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.029571601065753374\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5919282511210763,\n\
\ \"acc_stderr\": 0.03298574607842822,\n \"acc_norm\": 0.5919282511210763,\n\
\ \"acc_norm_stderr\": 0.03298574607842822\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.04243869242230524,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.04243869242230524\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884122,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884122\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.04616631111801714,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.04616631111801714\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5460122699386503,\n \"acc_stderr\": 0.0391170190467718,\n\
\ \"acc_norm\": 0.5460122699386503,\n \"acc_norm_stderr\": 0.0391170190467718\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161549,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161549\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.027778835904935437,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.027778835904935437\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7164750957854407,\n\
\ \"acc_stderr\": 0.01611731816683227,\n \"acc_norm\": 0.7164750957854407,\n\
\ \"acc_norm_stderr\": 0.01611731816683227\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.026756255129663765,\n\
\ \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.026756255129663765\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331146,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331146\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.02855582751652878,\n\
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.02855582751652878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n\
\ \"acc_stderr\": 0.028071928247946208,\n \"acc_norm\": 0.5755627009646302,\n\
\ \"acc_norm_stderr\": 0.028071928247946208\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.027667138569422708,\n\
\ \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.027667138569422708\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596143,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596143\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3767926988265971,\n\
\ \"acc_stderr\": 0.012376459593894398,\n \"acc_norm\": 0.3767926988265971,\n\
\ \"acc_norm_stderr\": 0.012376459593894398\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275668,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275668\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.020227834851568375,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.020227834851568375\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.04724577405731572,\n\
\ \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.04724577405731572\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.636734693877551,\n\
\ \"acc_stderr\": 0.030789051139030806,\n \"acc_norm\": 0.636734693877551,\n\
\ \"acc_norm_stderr\": 0.030789051139030806\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.582089552238806,\n \"acc_stderr\": 0.034875586404620636,\n\
\ \"acc_norm\": 0.582089552238806,\n \"acc_norm_stderr\": 0.034875586404620636\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.3855421686746988,\n \"acc_stderr\": 0.0378913442461155,\n\
\ \"acc_norm\": 0.3855421686746988,\n \"acc_norm_stderr\": 0.0378913442461155\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7134502923976608,\n\
\ \"acc_stderr\": 0.034678266857038266,\n \"acc_norm\": 0.7134502923976608,\n\
\ \"acc_norm_stderr\": 0.034678266857038266\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.37576499388004897,\n \"mc1_stderr\": 0.016954584060214297,\n\
\ \"mc2\": 0.5352609389838828,\n \"mc2_stderr\": 0.015584516858746202\n\
\ }\n}\n```"
repo_url: https://huggingface.co/xxyyy123/test_qkvo_adptor
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|arc:challenge|25_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hellaswag|10_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T04:28:57.572800.parquet'
- config_name: results
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- results_2023-09-03T04:28:57.572800.parquet
- split: latest
path:
- results_2023-09-03T04:28:57.572800.parquet
---
# Dataset Card for Evaluation run of xxyyy123/test_qkvo_adptor
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xxyyy123/test_qkvo_adptor
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xxyyy123/test_qkvo_adptor](https://huggingface.co/xxyyy123/test_qkvo_adptor) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__test_qkvo_adptor",
"harness_truthfulqa_mc_0",
               split="latest")
```
## Latest results
These are the [latest results from run 2023-09-03T04:28:57.572800](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__test_qkvo_adptor/blob/main/results_2023-09-03T04%3A28%3A57.572800.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5181817055404072,
"acc_stderr": 0.03503876216805227,
"acc_norm": 0.5217050459960606,
"acc_norm_stderr": 0.0350240051469643,
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5352609389838828,
"mc2_stderr": 0.015584516858746202
},
"harness|arc:challenge|25": {
"acc": 0.5349829351535836,
"acc_stderr": 0.014575583922019669,
"acc_norm": 0.5537542662116041,
"acc_norm_stderr": 0.014526705548539982
},
"harness|hellaswag|10": {
"acc": 0.6007767377016531,
"acc_stderr": 0.004887378682406531,
"acc_norm": 0.7898824935271859,
"acc_norm_stderr": 0.004065592811695945
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.0403356566784832,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.0403356566784832
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.041711158581816184,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.041711158581816184
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5645161290322581,
"acc_stderr": 0.028206225591502744,
"acc_norm": 0.5645161290322581,
"acc_norm_stderr": 0.028206225591502744
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.03366124489051449,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.03366124489051449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.035243908445117815,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.035243908445117815
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03427308652999933,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03427308652999933
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845454,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845454
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5282051282051282,
"acc_stderr": 0.02531063925493388,
"acc_norm": 0.5282051282051282,
"acc_norm_stderr": 0.02531063925493388
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959916,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.032437180551374116,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.032437180551374116
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.037579499229433426,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.037579499229433426
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7211009174311926,
"acc_stderr": 0.019227468876463507,
"acc_norm": 0.7211009174311926,
"acc_norm_stderr": 0.019227468876463507
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373617,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5919282511210763,
"acc_stderr": 0.03298574607842822,
"acc_norm": 0.5919282511210763,
"acc_norm_stderr": 0.03298574607842822
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.04243869242230524,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.04243869242230524
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884122,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884122
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.04616631111801714,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.04616631111801714
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5460122699386503,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.5460122699386503,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.027778835904935437,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.027778835904935437
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7164750957854407,
"acc_stderr": 0.01611731816683227,
"acc_norm": 0.7164750957854407,
"acc_norm_stderr": 0.01611731816683227
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.026756255129663765,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.026756255129663765
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331146,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331146
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.02855582751652878,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.02855582751652878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946208,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946208
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5524691358024691,
"acc_stderr": 0.027667138569422708,
"acc_norm": 0.5524691358024691,
"acc_norm_stderr": 0.027667138569422708
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596143,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596143
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3767926988265971,
"acc_stderr": 0.012376459593894398,
"acc_norm": 0.3767926988265971,
"acc_norm_stderr": 0.012376459593894398
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275668,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275668
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5,
"acc_stderr": 0.020227834851568375,
"acc_norm": 0.5,
"acc_norm_stderr": 0.020227834851568375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.582089552238806,
"acc_stderr": 0.034875586404620636,
"acc_norm": 0.582089552238806,
"acc_norm_stderr": 0.034875586404620636
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.0378913442461155,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.0378913442461155
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5352609389838828,
"mc2_stderr": 0.015584516858746202
}
}
```
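For quick inspection, the per-task scores above can be aggregated with a short script. This is a minimal sketch, not part of the evaluation pipeline; the `sample` dict below is a tiny illustrative subset mirroring the JSON structure above, not the full results.

```python
def mean_mmlu_acc(results: dict) -> float:
    """Average the 'acc' metric over all hendrycksTest (MMLU) subtasks."""
    accs = [
        v["acc"]
        for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# Tiny illustrative sample; non-MMLU entries (e.g. truthfulqa) are ignored.
sample = {
    "harness|hendrycksTest-virology|5": {"acc": 0.3855421686746988},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7134502923976608},
    "harness|truthfulqa:mc|0": {"mc1": 0.37576499388004897},
}
print(round(mean_mmlu_acc(sample), 4))  # → 0.5495
```

The same function applied to the full `"harness|hendrycksTest-*"` entries reproduces the aggregate MMLU accuracy reported on the leaderboard.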
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
cmsolson75/LLM_artist_lyrics | 2023-09-03T04:48:46.000Z | [
"license:apache-2.0",
"region:us"
] | cmsolson75 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
tori29umai/CounterfeitXL-V1.0_canny_noline_dataset | 2023-09-03T05:00:53.000Z | [
"license:openrail",
"region:us"
] | tori29umai | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_RWKV__rwkv-4-430m-pile | 2023-09-03T04:51:52.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of RWKV/rwkv-4-430m-pile
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RWKV/rwkv-4-430m-pile](https://huggingface.co/RWKV/rwkv-4-430m-pile) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RWKV__rwkv-4-430m-pile\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T04:50:41.719497](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-4-430m-pile/blob/main/results_2023-09-03T04%3A50%3A41.719497.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24979813607282328,\n\
\ \"acc_stderr\": 0.03133246449436127,\n \"acc_norm\": 0.2514324934989431,\n\
\ \"acc_norm_stderr\": 0.0313438236936013,\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396726,\n \"mc2\": 0.39582218824731835,\n\
\ \"mc2_stderr\": 0.014672961093755748\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2380546075085324,\n \"acc_stderr\": 0.012445770028026203,\n\
\ \"acc_norm\": 0.26706484641638223,\n \"acc_norm_stderr\": 0.01292893319649635\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.33270264887472617,\n\
\ \"acc_stderr\": 0.004702181042215901,\n \"acc_norm\": 0.4001194981079466,\n\
\ \"acc_norm_stderr\": 0.004889210628907947\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2740740740740741,\n\
\ \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.2740740740740741,\n\
\ \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.03279000406310053,\n\
\ \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.03279000406310053\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2037735849056604,\n \"acc_stderr\": 0.024790784501775402,\n\
\ \"acc_norm\": 0.2037735849056604,\n \"acc_norm_stderr\": 0.024790784501775402\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.0309528902177499,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.0309528902177499\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.028957342788342347,\n\
\ \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.028957342788342347\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.0409698513984367,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.0409698513984367\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27741935483870966,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.27741935483870966,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21674876847290642,\n \"acc_stderr\": 0.02899033125251624,\n\
\ \"acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.02899033125251624\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565319,\n\
\ \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565319\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2153846153846154,\n \"acc_stderr\": 0.020843034557462878,\n\
\ \"acc_norm\": 0.2153846153846154,\n \"acc_norm_stderr\": 0.020843034557462878\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279458,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279458\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24587155963302754,\n \"acc_stderr\": 0.018461940968708443,\n \"\
acc_norm\": 0.24587155963302754,\n \"acc_norm_stderr\": 0.018461940968708443\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693254,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693254\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598025,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598025\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21973094170403587,\n\
\ \"acc_stderr\": 0.027790177064383605,\n \"acc_norm\": 0.21973094170403587,\n\
\ \"acc_norm_stderr\": 0.027790177064383605\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.1984732824427481,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.1984732824427481,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467765,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467765\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1553398058252427,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.1553398058252427,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23076923076923078,\n\
\ \"acc_stderr\": 0.027601921381417625,\n \"acc_norm\": 0.23076923076923078,\n\
\ \"acc_norm_stderr\": 0.027601921381417625\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24265644955300128,\n\
\ \"acc_stderr\": 0.01532988894089988,\n \"acc_norm\": 0.24265644955300128,\n\
\ \"acc_norm_stderr\": 0.01532988894089988\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261436,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261436\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.024404394928087866,\n\
\ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.024404394928087866\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2057877813504823,\n\
\ \"acc_stderr\": 0.022961339906764244,\n \"acc_norm\": 0.2057877813504823,\n\
\ \"acc_norm_stderr\": 0.022961339906764244\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967284,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967284\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2872340425531915,\n \"acc_stderr\": 0.026992199173064356,\n \
\ \"acc_norm\": 0.2872340425531915,\n \"acc_norm_stderr\": 0.026992199173064356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24315514993481094,\n\
\ \"acc_stderr\": 0.01095655665441736,\n \"acc_norm\": 0.24315514993481094,\n\
\ \"acc_norm_stderr\": 0.01095655665441736\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.26838235294117646,\n \"acc_stderr\": 0.026917481224377225,\n\
\ \"acc_norm\": 0.26838235294117646,\n \"acc_norm_stderr\": 0.026917481224377225\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2581699346405229,\n \"acc_stderr\": 0.017704531653250075,\n \
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.017704531653250075\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.03831305140884603,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.03831305140884603\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.23673469387755103,\n \"acc_stderr\": 0.02721283588407315,\n\
\ \"acc_norm\": 0.23673469387755103,\n \"acc_norm_stderr\": 0.02721283588407315\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21393034825870647,\n\
\ \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.21393034825870647,\n\
\ \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2573099415204678,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.2573099415204678,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396726,\n \"mc2\": 0.39582218824731835,\n\
\ \"mc2_stderr\": 0.014672961093755748\n }\n}\n```"
repo_url: https://huggingface.co/RWKV/rwkv-4-430m-pile
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|arc:challenge|25_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hellaswag|10_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T04:50:41.719497.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T04:50:41.719497.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T04:50:41.719497.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T04:50:41.719497.parquet'
- config_name: results
data_files:
- split: 2023_09_03T04_50_41.719497
path:
- results_2023-09-03T04:50:41.719497.parquet
- split: latest
path:
- results_2023-09-03T04:50:41.719497.parquet
---
# Dataset Card for Evaluation run of RWKV/rwkv-4-430m-pile
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/RWKV/rwkv-4-430m-pile
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [RWKV/rwkv-4-430m-pile](https://huggingface.co/RWKV/rwkv-4-430m-pile) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RWKV__rwkv-4-430m-pile",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-03T04:50:41.719497](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-4-430m-pile/blob/main/results_2023-09-03T04%3A50%3A41.719497.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the task-specific configurations, under the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24979813607282328,
"acc_stderr": 0.03133246449436127,
"acc_norm": 0.2514324934989431,
"acc_norm_stderr": 0.0313438236936013,
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396726,
"mc2": 0.39582218824731835,
"mc2_stderr": 0.014672961093755748
},
"harness|arc:challenge|25": {
"acc": 0.2380546075085324,
"acc_stderr": 0.012445770028026203,
"acc_norm": 0.26706484641638223,
"acc_norm_stderr": 0.01292893319649635
},
"harness|hellaswag|10": {
"acc": 0.33270264887472617,
"acc_stderr": 0.004702181042215901,
"acc_norm": 0.4001194981079466,
"acc_norm_stderr": 0.004889210628907947
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.03279000406310053,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.03279000406310053
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2037735849056604,
"acc_stderr": 0.024790784501775402,
"acc_norm": 0.2037735849056604,
"acc_norm_stderr": 0.024790784501775402
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.0309528902177499,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.0309528902177499
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2680851063829787,
"acc_stderr": 0.028957342788342347,
"acc_norm": 0.2680851063829787,
"acc_norm_stderr": 0.028957342788342347
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.0409698513984367,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.0409698513984367
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27741935483870966,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.27741935483870966,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.02899033125251624,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.02899033125251624
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2538860103626943,
"acc_stderr": 0.03141024780565319,
"acc_norm": 0.2538860103626943,
"acc_norm_stderr": 0.03141024780565319
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2153846153846154,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.2153846153846154,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279458,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279458
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24587155963302754,
"acc_stderr": 0.018461940968708443,
"acc_norm": 0.24587155963302754,
"acc_norm_stderr": 0.018461940968708443
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693254,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693254
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598025,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598025
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21973094170403587,
"acc_stderr": 0.027790177064383605,
"acc_norm": 0.21973094170403587,
"acc_norm_stderr": 0.027790177064383605
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.1984732824427481,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.1984732824427481,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467765,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467765
},
"harness|hendrycksTest-management|5": {
"acc": 0.1553398058252427,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.1553398058252427,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.027601921381417625,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.027601921381417625
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24265644955300128,
"acc_stderr": 0.01532988894089988,
"acc_norm": 0.24265644955300128,
"acc_norm_stderr": 0.01532988894089988
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261436,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261436
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.024404394928087866,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.024404394928087866
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2057877813504823,
"acc_stderr": 0.022961339906764244,
"acc_norm": 0.2057877813504823,
"acc_norm_stderr": 0.022961339906764244
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2872340425531915,
"acc_stderr": 0.026992199173064356,
"acc_norm": 0.2872340425531915,
"acc_norm_stderr": 0.026992199173064356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24315514993481094,
"acc_stderr": 0.01095655665441736,
"acc_norm": 0.24315514993481094,
"acc_norm_stderr": 0.01095655665441736
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.26838235294117646,
"acc_stderr": 0.026917481224377225,
"acc_norm": 0.26838235294117646,
"acc_norm_stderr": 0.026917481224377225
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.017704531653250075,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.017704531653250075
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884603,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884603
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23673469387755103,
"acc_stderr": 0.02721283588407315,
"acc_norm": 0.23673469387755103,
"acc_norm_stderr": 0.02721283588407315
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.02899690969332891,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.02899690969332891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2573099415204678,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.2573099415204678,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396726,
"mc2": 0.39582218824731835,
"mc2_stderr": 0.014672961093755748
}
}
```
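The `all` block at the top of such a results file is simply the unweighted mean of the per-task scores. A minimal sketch of recomputing that mean from a results dict (the two task entries below are excerpts copied from the JSON above; the `mean_acc` helper is not part of the harness, just an illustration):

```python
# Recompute an aggregate accuracy as the plain mean of per-task "acc" values.
# The two entries below are excerpts from the results JSON above.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.2710843373493976},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.2573099415204678},
}

def mean_acc(results: dict) -> float:
    """Unweighted mean of the per-task `acc` scores."""
    accs = [v["acc"] for v in results.values() if "acc" in v]
    return sum(accs) / len(accs)

print(round(mean_acc(results), 4))
```

The same pattern applies to `acc_norm`; standard errors, by contrast, are not averaged this way.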
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dongyoung4091/hh-rlhf_with_features_rx | 2023-09-03T04:57:41.000Z | [
"region:us"
] | dongyoung4091 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: helpfulness_chosen
dtype: int64
- name: helpfulness_rejected
dtype: int64
- name: specificity_chosen
dtype: int64
- name: specificity_rejected
dtype: int64
- name: intent_chosen
dtype: int64
- name: intent_rejected
dtype: int64
- name: factuality_chosen
dtype: int64
- name: factuality_rejected
dtype: int64
- name: easy-to-understand_chosen
dtype: int64
- name: easy-to-understand_rejected
dtype: int64
- name: relevance_chosen
dtype: int64
- name: relevance_rejected
dtype: int64
- name: readability_chosen
dtype: int64
- name: readability_rejected
dtype: int64
- name: enough-detail_chosen
dtype: int64
- name: enough-detail_rejected
dtype: int64
- name: biased:_chosen
dtype: int64
- name: biased:_rejected
dtype: int64
- name: fail-to-consider-individual-preferences_chosen
dtype: int64
- name: fail-to-consider-individual-preferences_rejected
dtype: int64
- name: repetetive_chosen
dtype: int64
- name: repetetive_rejected
dtype: int64
- name: fail-to-consider-context_chosen
dtype: int64
- name: fail-to-consider-context_rejected
dtype: int64
- name: too-long_chosen
dtype: int64
- name: too-long_rejected
dtype: int64
- name: model_A_chosen
dtype: float64
- name: model_A_rejected
dtype: float64
- name: model_B_chosen
dtype: float64
- name: model_B_rejected
dtype: float64
- name: external_rm1_chosen
dtype: float64
- name: external_rm1_rejected
dtype: float64
- name: external_rm2_chosen
dtype: float64
- name: external_rm2_rejected
dtype: float64
- name: RM_readability_chosen
dtype: float64
- name: RM_readability_rejected
dtype: float64
- name: RM_fail-to-consider-context_chosen
dtype: float64
- name: RM_fail-to-consider-context_rejected
dtype: float64
- name: RM_enough-detail_chosen
dtype: float64
- name: RM_enough-detail_rejected
dtype: float64
- name: zeroshot_helpfulness_chosen
dtype: float64
- name: zeroshot_specificity_chosen
dtype: float64
- name: zeroshot_intent_chosen
dtype: float64
- name: zeroshot_factuality_chosen
dtype: float64
- name: zeroshot_easy-to-understand_chosen
dtype: float64
- name: zeroshot_relevance_chosen
dtype: float64
- name: zeroshot_readability_chosen
dtype: float64
- name: zeroshot_enough-detail_chosen
dtype: float64
- name: zeroshot_biased:_chosen
dtype: float64
- name: zeroshot_fail-to-consider-individual-preferences_chosen
dtype: float64
- name: zeroshot_repetetive_chosen
dtype: float64
- name: zeroshot_fail-to-consider-context_chosen
dtype: float64
- name: zeroshot_too-long_chosen
dtype: float64
- name: zeroshot_helpfulness_rejected
dtype: float64
- name: zeroshot_specificity_rejected
dtype: float64
- name: zeroshot_intent_rejected
dtype: float64
- name: zeroshot_factuality_rejected
dtype: float64
- name: zeroshot_easy-to-understand_rejected
dtype: float64
- name: zeroshot_relevance_rejected
dtype: float64
- name: zeroshot_readability_rejected
dtype: float64
- name: zeroshot_enough-detail_rejected
dtype: float64
- name: zeroshot_biased:_rejected
dtype: float64
- name: zeroshot_fail-to-consider-individual-preferences_rejected
dtype: float64
- name: zeroshot_repetetive_rejected
dtype: float64
- name: zeroshot_fail-to-consider-context_rejected
dtype: float64
- name: zeroshot_too-long_rejected
dtype: float64
splits:
- name: train
num_bytes: 11643573
num_examples: 9574
- name: test
num_bytes: 11614860
num_examples: 9574
download_size: 13958714
dataset_size: 23258433
---
# Dataset Card for "hh-rlhf_with_features_rx"
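Each row of this dataset carries per-feature scores for both the chosen and the rejected response (field names such as `helpfulness_chosen` / `helpfulness_rejected` follow the schema listed above). A minimal sketch of measuring how often a feature agrees with the human preference; the sample rows and the `agreement_rate` helper are made up for illustration, not part of the dataset:

```python
# Fraction of preference pairs where the chosen response scores strictly
# higher than the rejected one on a given feature. Sample rows are invented;
# real rows come from load_dataset("dongyoung4091/hh-rlhf_with_features_rx").
rows = [
    {"helpfulness_chosen": 4, "helpfulness_rejected": 2},
    {"helpfulness_chosen": 3, "helpfulness_rejected": 3},
    {"helpfulness_chosen": 1, "helpfulness_rejected": 4},
]

def agreement_rate(rows, feature="helpfulness"):
    """Fraction of pairs where the chosen response wins strictly on `feature`."""
    wins = sum(
        1 for r in rows if r[f"{feature}_chosen"] > r[f"{feature}_rejected"]
    )
    return wins / len(rows)

print(agreement_rate(rows))  # 1 of 3 sample pairs is a strict win for `chosen`
```

The same function works for any of the listed feature pairs (`specificity`, `factuality`, etc.) by changing the `feature` argument.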
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
YxBxRyXJx/QEU_dataset | 2023-09-03T05:06:49.000Z | [
"license:apache-2.0",
"region:us"
] | YxBxRyXJx | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_RWKV__rwkv-4-1b5-pile | 2023-09-03T05:10:37.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of RWKV/rwkv-4-1b5-pile
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RWKV/rwkv-4-1b5-pile](https://huggingface.co/RWKV/rwkv-4-1b5-pile) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RWKV__rwkv-4-1b5-pile\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T05:09:25.053810](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-4-1b5-pile/blob/main/results_2023-09-03T05%3A09%3A25.053810.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2605288864903845,\n\
\ \"acc_stderr\": 0.03169920278335169,\n \"acc_norm\": 0.26325783964965394,\n\
\ \"acc_norm_stderr\": 0.03170990565366411,\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.0144508467141239,\n \"mc2\": 0.35795696104127084,\n\
\ \"mc2_stderr\": 0.013727456216843162\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2764505119453925,\n \"acc_stderr\": 0.013069662474252425,\n\
\ \"acc_norm\": 0.318259385665529,\n \"acc_norm_stderr\": 0.013611993916971453\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4033061143198566,\n\
\ \"acc_stderr\": 0.00489558632940132,\n \"acc_norm\": 0.5225054769966142,\n\
\ \"acc_norm_stderr\": 0.004984724235115118\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.27631578947368424,\n \"acc_stderr\": 0.03639057569952925,\n\
\ \"acc_norm\": 0.27631578947368424,\n \"acc_norm_stderr\": 0.03639057569952925\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23018867924528302,\n \"acc_stderr\": 0.025907897122408166,\n\
\ \"acc_norm\": 0.23018867924528302,\n \"acc_norm_stderr\": 0.025907897122408166\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.16,\n \"acc_stderr\": 0.0368452949177471,\n \"acc_norm\": 0.16,\n\
\ \"acc_norm_stderr\": 0.0368452949177471\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.033450369167889925,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.033450369167889925\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.040233822736177476,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.040233822736177476\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.029513196625539345,\n\
\ \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.029513196625539345\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400175,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400175\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.033333333333333375,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.033333333333333375\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27741935483870966,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.27741935483870966,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.03524390844511783,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.03524390844511783\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20707070707070707,\n \"acc_stderr\": 0.028869778460267063,\n \"\
acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.028869778460267063\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.030516111371476008,\n\
\ \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.030516111371476008\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24102564102564103,\n \"acc_stderr\": 0.021685546665333205,\n\
\ \"acc_norm\": 0.24102564102564103,\n \"acc_norm_stderr\": 0.021685546665333205\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.17880794701986755,\n \"acc_stderr\": 0.031287448506007225,\n \"\
acc_norm\": 0.17880794701986755,\n \"acc_norm_stderr\": 0.031287448506007225\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23119266055045873,\n \"acc_stderr\": 0.01807575024163315,\n \"\
acc_norm\": 0.23119266055045873,\n \"acc_norm_stderr\": 0.01807575024163315\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598018,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598018\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.336322869955157,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212095,\n \"\
acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212095\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n\
\ \"acc_stderr\": 0.029872577708891172,\n \"acc_norm\": 0.2948717948717949,\n\
\ \"acc_norm_stderr\": 0.029872577708891172\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26181353767560667,\n\
\ \"acc_stderr\": 0.01572083867844526,\n \"acc_norm\": 0.26181353767560667,\n\
\ \"acc_norm_stderr\": 0.01572083867844526\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2947976878612717,\n \"acc_stderr\": 0.024547617794803838,\n\
\ \"acc_norm\": 0.2947976878612717,\n \"acc_norm_stderr\": 0.024547617794803838\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.01489339173524959,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.01489339173524959\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912258,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912258\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2540192926045016,\n\
\ \"acc_stderr\": 0.02472386150477169,\n \"acc_norm\": 0.2540192926045016,\n\
\ \"acc_norm_stderr\": 0.02472386150477169\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \"\
acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26597131681877445,\n\
\ \"acc_stderr\": 0.011285033165551281,\n \"acc_norm\": 0.26597131681877445,\n\
\ \"acc_norm_stderr\": 0.011285033165551281\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.25735294117647056,\n \"acc_stderr\": 0.026556519470041513,\n\
\ \"acc_norm\": 0.25735294117647056,\n \"acc_norm_stderr\": 0.026556519470041513\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27941176470588236,\n \"acc_stderr\": 0.018152871051538795,\n \
\ \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.018152871051538795\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22857142857142856,\n \"acc_stderr\": 0.02688214492230774,\n\
\ \"acc_norm\": 0.22857142857142856,\n \"acc_norm_stderr\": 0.02688214492230774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21393034825870647,\n\
\ \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.21393034825870647,\n\
\ \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.03571609230053481,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.03571609230053481\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.0144508467141239,\n \"mc2\": 0.35795696104127084,\n\
\ \"mc2_stderr\": 0.013727456216843162\n }\n}\n```"
repo_url: https://huggingface.co/RWKV/rwkv-4-1b5-pile
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|arc:challenge|25_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hellaswag|10_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T05:09:25.053810.parquet'
- config_name: results
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- results_2023-09-03T05:09:25.053810.parquet
- split: latest
path:
- results_2023-09-03T05:09:25.053810.parquet
---
# Dataset Card for Evaluation run of RWKV/rwkv-4-1b5-pile
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/RWKV/rwkv-4-1b5-pile
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [RWKV/rwkv-4-1b5-pile](https://huggingface.co/RWKV/rwkv-4-1b5-pile) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RWKV__rwkv-4-1b5-pile",
"harness_truthfulqa_mc_0",
	split="latest")
```
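The split names visible in the YAML configs above are derived from the run timestamp by replacing `-` and `:` with `_` (e.g. `2023-09-03T05:09:25.053810` becomes the split `2023_09_03T05_09_25.053810`). A minimal sketch of that mapping, useful when selecting a specific timestamped run rather than `latest` (the helper name is illustrative, not part of any library):

```python
def timestamp_to_split(timestamp: str) -> str:
    """Convert a run timestamp into the split name used by this dataset.

    The split names in the YAML configs are the run timestamp with the
    characters '-' and ':' replaced by '_'.
    """
    return timestamp.replace("-", "_").replace(":", "_")

# Example: the run documented in this card
print(timestamp_to_split("2023-09-03T05:09:25.053810"))
# → 2023_09_03T05_09_25.053810
```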
## Latest results
These are the [latest results from run 2023-09-03T05:09:25.053810](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-4-1b5-pile/blob/main/results_2023-09-03T05%3A09%3A25.053810.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its timestamped splits and in the "latest" split of its configuration):
```json
{
"all": {
"acc": 0.2605288864903845,
"acc_stderr": 0.03169920278335169,
"acc_norm": 0.26325783964965394,
"acc_norm_stderr": 0.03170990565366411,
"mc1": 0.2178702570379437,
"mc1_stderr": 0.0144508467141239,
"mc2": 0.35795696104127084,
"mc2_stderr": 0.013727456216843162
},
"harness|arc:challenge|25": {
"acc": 0.2764505119453925,
"acc_stderr": 0.013069662474252425,
"acc_norm": 0.318259385665529,
"acc_norm_stderr": 0.013611993916971453
},
"harness|hellaswag|10": {
"acc": 0.4033061143198566,
"acc_stderr": 0.00489558632940132,
"acc_norm": 0.5225054769966142,
"acc_norm_stderr": 0.004984724235115118
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.27631578947368424,
"acc_stderr": 0.03639057569952925,
"acc_norm": 0.27631578947368424,
"acc_norm_stderr": 0.03639057569952925
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23018867924528302,
"acc_stderr": 0.025907897122408166,
"acc_norm": 0.23018867924528302,
"acc_norm_stderr": 0.025907897122408166
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.16,
"acc_stderr": 0.0368452949177471,
"acc_norm": 0.16,
"acc_norm_stderr": 0.0368452949177471
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.033450369167889925,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.033450369167889925
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.040233822736177476,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.040233822736177476
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.029513196625539345,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.029513196625539345
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400175,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400175
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.033333333333333375,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.033333333333333375
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27741935483870966,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.27741935483870966,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.03524390844511783,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.03524390844511783
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20707070707070707,
"acc_stderr": 0.028869778460267063,
"acc_norm": 0.20707070707070707,
"acc_norm_stderr": 0.028869778460267063
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.030516111371476008,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.030516111371476008
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24102564102564103,
"acc_stderr": 0.021685546665333205,
"acc_norm": 0.24102564102564103,
"acc_norm_stderr": 0.021685546665333205
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.17880794701986755,
"acc_stderr": 0.031287448506007225,
"acc_norm": 0.17880794701986755,
"acc_norm_stderr": 0.031287448506007225
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23119266055045873,
"acc_stderr": 0.01807575024163315,
"acc_norm": 0.23119266055045873,
"acc_norm_stderr": 0.01807575024163315
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598018,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598018
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3305785123966942,
"acc_stderr": 0.04294340845212095,
"acc_norm": 0.3305785123966942,
"acc_norm_stderr": 0.04294340845212095
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891172,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891172
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26181353767560667,
"acc_stderr": 0.01572083867844526,
"acc_norm": 0.26181353767560667,
"acc_norm_stderr": 0.01572083867844526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.024547617794803838,
"acc_norm": 0.2947976878612717,
"acc_norm_stderr": 0.024547617794803838
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.01489339173524959,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.01489339173524959
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2540192926045016,
"acc_stderr": 0.02472386150477169,
"acc_norm": 0.2540192926045016,
"acc_norm_stderr": 0.02472386150477169
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.25,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590638,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26597131681877445,
"acc_stderr": 0.011285033165551281,
"acc_norm": 0.26597131681877445,
"acc_norm_stderr": 0.011285033165551281
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.25735294117647056,
"acc_stderr": 0.026556519470041513,
"acc_norm": 0.25735294117647056,
"acc_norm_stderr": 0.026556519470041513
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.018152871051538795,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.018152871051538795
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22857142857142856,
"acc_stderr": 0.02688214492230774,
"acc_norm": 0.22857142857142856,
"acc_norm_stderr": 0.02688214492230774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.02899690969332891,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.02899690969332891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.03571609230053481,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.03571609230053481
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2178702570379437,
"mc1_stderr": 0.0144508467141239,
"mc2": 0.35795696104127084,
"mc2_stderr": 0.013727456216843162
}
}
```
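The overall figures in the `all` block are (to a close approximation) unweighted means of the task-level metrics. A small illustrative sketch of that aggregation over a subset of the results, with the values copied verbatim from the JSON above:

```python
# Unweighted mean of task-level accuracies, sketched on a subset of the
# results shown above (a subset only, so the mean differs from the "all" block).
subset = {
    "harness|arc:challenge|25": 0.2764505119453925,
    "harness|hellaswag|10": 0.4033061143198566,
    "harness|hendrycksTest-abstract_algebra|5": 0.25,
}

mean_acc = sum(subset.values()) / len(subset)
print(f"mean acc over subset: {mean_acc:.4f}")
# → mean acc over subset: 0.3099
```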
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w-gate_up_down_proj | 2023-09-03T05:21:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-gate_up_down_proj
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-gate_up_down_proj](https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-gate_up_down_proj)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w-gate_up_down_proj\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T05:20:14.306293](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w-gate_up_down_proj/blob/main/results_2023-09-03T05%3A20%3A14.306293.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5593409440792578,\n\
\ \"acc_stderr\": 0.03426348217406878,\n \"acc_norm\": 0.5635899001067951,\n\
\ \"acc_norm_stderr\": 0.03424362957422974,\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834562,\n \"mc2\": 0.3993091613948094,\n\
\ \"mc2_stderr\": 0.014079610343632127\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5290102389078498,\n \"acc_stderr\": 0.014586776355294321,\n\
\ \"acc_norm\": 0.5716723549488054,\n \"acc_norm_stderr\": 0.014460496367599012\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.614618601872137,\n\
\ \"acc_stderr\": 0.004856906473719381,\n \"acc_norm\": 0.8226448914558853,\n\
\ \"acc_norm_stderr\": 0.0038118830709112663\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04122728707651282\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.044629175353369355,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.044629175353369355\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36243386243386244,\n \"acc_stderr\": 0.024757473902752052,\n \"\
acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.024757473902752052\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n\
\ \"acc_stderr\": 0.027104826328100944,\n \"acc_norm\": 0.6516129032258065,\n\
\ \"acc_norm_stderr\": 0.027104826328100944\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391245,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391245\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7171717171717171,\n \"acc_stderr\": 0.03208779558786752,\n \"\
acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.03208779558786752\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5076923076923077,\n \"acc_stderr\": 0.025348006031534778,\n\
\ \"acc_norm\": 0.5076923076923077,\n \"acc_norm_stderr\": 0.025348006031534778\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.0322841062671639,\n \
\ \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.0322841062671639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7559633027522936,\n \"acc_stderr\": 0.018415286351416406,\n \"\
acc_norm\": 0.7559633027522936,\n \"acc_norm_stderr\": 0.018415286351416406\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.02798569938703643,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.02798569938703643\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.037466683254700206,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.037466683254700206\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952688,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952688\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\
\ \"acc_stderr\": 0.024904439098918228,\n \"acc_norm\": 0.8247863247863247,\n\
\ \"acc_norm_stderr\": 0.024904439098918228\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n\
\ \"acc_stderr\": 0.014927447101937158,\n \"acc_norm\": 0.7752234993614304,\n\
\ \"acc_norm_stderr\": 0.014927447101937158\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895803,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895803\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n\
\ \"acc_stderr\": 0.014950103002475347,\n \"acc_norm\": 0.2759776536312849,\n\
\ \"acc_norm_stderr\": 0.014950103002475347\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.028074158947600653,\n\
\ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.028074158947600653\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.02736807824397164,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.02736807824397164\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.026462487777001872,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.026462487777001872\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778862,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778862\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41590612777053454,\n\
\ \"acc_stderr\": 0.012588323850313617,\n \"acc_norm\": 0.41590612777053454,\n\
\ \"acc_norm_stderr\": 0.012588323850313617\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.030320243265004137,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.030320243265004137\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.565359477124183,\n \"acc_stderr\": 0.020054269200726463,\n \
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.020054269200726463\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235933,\n\
\ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235933\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.03220024104534205,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.03220024104534205\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834562,\n \"mc2\": 0.3993091613948094,\n\
\ \"mc2_stderr\": 0.014079610343632127\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-gate_up_down_proj
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|arc:challenge|25_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hellaswag|10_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:20:14.306293.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:20:14.306293.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T05:20:14.306293.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T05:20:14.306293.parquet'
- config_name: results
data_files:
- split: 2023_09_03T05_20_14.306293
path:
- results_2023-09-03T05:20:14.306293.parquet
- split: latest
path:
- results_2023-09-03T05:20:14.306293.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-gate_up_down_proj
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-gate_up_down_proj
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-gate_up_down_proj](https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-gate_up_down_proj) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w-gate_up_down_proj",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-03T05:20:14.306293](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w-gate_up_down_proj/blob/main/results_2023-09-03T05%3A20%3A14.306293.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5593409440792578,
"acc_stderr": 0.03426348217406878,
"acc_norm": 0.5635899001067951,
"acc_norm_stderr": 0.03424362957422974,
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834562,
"mc2": 0.3993091613948094,
"mc2_stderr": 0.014079610343632127
},
"harness|arc:challenge|25": {
"acc": 0.5290102389078498,
"acc_stderr": 0.014586776355294321,
"acc_norm": 0.5716723549488054,
"acc_norm_stderr": 0.014460496367599012
},
"harness|hellaswag|10": {
"acc": 0.614618601872137,
"acc_stderr": 0.004856906473719381,
"acc_norm": 0.8226448914558853,
"acc_norm_stderr": 0.0038118830709112663
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651282,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651282
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.044629175353369355,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.044629175353369355
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.024757473902752052,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.024757473902752052
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.027104826328100944,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.027104826328100944
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391245,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391245
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7171717171717171,
"acc_stderr": 0.03208779558786752,
"acc_norm": 0.7171717171717171,
"acc_norm_stderr": 0.03208779558786752
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5076923076923077,
"acc_stderr": 0.025348006031534778,
"acc_norm": 0.5076923076923077,
"acc_norm_stderr": 0.025348006031534778
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253252,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253252
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5546218487394958,
"acc_stderr": 0.0322841062671639,
"acc_norm": 0.5546218487394958,
"acc_norm_stderr": 0.0322841062671639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7559633027522936,
"acc_stderr": 0.018415286351416406,
"acc_norm": 0.7559633027522936,
"acc_norm_stderr": 0.018415286351416406
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.02798569938703643,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.02798569938703643
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.037466683254700206,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.037466683254700206
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952688,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952688
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8247863247863247,
"acc_stderr": 0.024904439098918228,
"acc_norm": 0.8247863247863247,
"acc_norm_stderr": 0.024904439098918228
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.014927447101937158,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.014927447101937158
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895803,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475347,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475347
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.028074158947600653,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.028074158947600653
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.02736807824397164,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.02736807824397164
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.026462487777001872,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.026462487777001872
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778862,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41590612777053454,
"acc_stderr": 0.012588323850313617,
"acc_norm": 0.41590612777053454,
"acc_norm_stderr": 0.012588323850313617
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.030320243265004137,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.030320243265004137
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.020054269200726463,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.020054269200726463
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.031130880396235933,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.031130880396235933
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.03220024104534205,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.03220024104534205
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834562,
"mc2": 0.3993091613948094,
"mc2_stderr": 0.014079610343632127
}
}
```
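As a quick sanity check, individual scores from the JSON above can be re-averaged locally. The sketch below copies three MMLU sub-task accuracies verbatim from this run; the mean is only over this illustrative subset, not the full aggregate reported under `"all"`:

```python
# Average a handful of per-task accuracies copied from the results JSON above.
# The values are verbatim from this run; the subset choice is illustrative.
subset = {
    "harness|hendrycksTest-abstract_algebra|5": 0.34,
    "harness|hendrycksTest-anatomy|5": 0.4962962962962963,
    "harness|hendrycksTest-astronomy|5": 0.5986842105263158,
}
mean_acc = sum(subset.values()) / len(subset)
print(f"mean acc over {len(subset)} tasks: {mean_acc:.4f}")
```

The same pattern extends to all 57 `hendrycksTest-*` entries if you load the "results" config and iterate over its columns.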
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_RWKV__rwkv-raven-1b5 | 2023-09-03T05:22:28.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of RWKV/rwkv-raven-1b5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RWKV/rwkv-raven-1b5](https://huggingface.co/RWKV/rwkv-raven-1b5) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RWKV__rwkv-raven-1b5\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T05:21:18.307582](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-raven-1b5/blob/main/results_2023-09-03T05%3A21%3A18.307582.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2624381073861863,\n\
\ \"acc_stderr\": 0.03192172274184596,\n \"acc_norm\": 0.2650819779663321,\n\
\ \"acc_norm_stderr\": 0.031931036762425556,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871117,\n \"mc2\": 0.37088025961521126,\n\
\ \"mc2_stderr\": 0.013858714584040949\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2815699658703072,\n \"acc_stderr\": 0.013143376735009017,\n\
\ \"acc_norm\": 0.318259385665529,\n \"acc_norm_stderr\": 0.013611993916971453\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.40669189404501094,\n\
\ \"acc_stderr\": 0.004902125388002211,\n \"acc_norm\": 0.5259908384783908,\n\
\ \"acc_norm_stderr\": 0.004983035420235711\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.039446241625011175,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.039446241625011175\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.02619980880756193,\n\
\ \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.02619980880756193\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364397,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364397\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02850485647051418,\n\
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02850485647051418\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813386,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813386\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.0360010569272777,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.0360010569272777\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790607,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790607\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27741935483870966,\n\
\ \"acc_stderr\": 0.02547019683590005,\n \"acc_norm\": 0.27741935483870966,\n\
\ \"acc_norm_stderr\": 0.02547019683590005\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2561576354679803,\n \"acc_stderr\": 0.030712730070982592,\n\
\ \"acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.030712730070982592\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603489,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603489\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.22727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.29015544041450775,\n \"acc_stderr\": 0.032752644677915145,\n\
\ \"acc_norm\": 0.29015544041450775,\n \"acc_norm_stderr\": 0.032752644677915145\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24615384615384617,\n \"acc_stderr\": 0.021840866990423077,\n\
\ \"acc_norm\": 0.24615384615384617,\n \"acc_norm_stderr\": 0.021840866990423077\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.21851851851851853,\n \"acc_stderr\": 0.02519575225182379,\n \
\ \"acc_norm\": 0.21851851851851853,\n \"acc_norm_stderr\": 0.02519575225182379\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.027722065493361255,\n\
\ \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.027722065493361255\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389985,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389985\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23119266055045873,\n \"acc_stderr\": 0.01807575024163315,\n \"\
acc_norm\": 0.23119266055045873,\n \"acc_norm_stderr\": 0.01807575024163315\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3194444444444444,\n \"acc_stderr\": 0.03179876342176852,\n \"\
acc_norm\": 0.3194444444444444,\n \"acc_norm_stderr\": 0.03179876342176852\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.21568627450980393,\n \"acc_stderr\": 0.028867431449849303,\n \"\
acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.028867431449849303\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598018,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598018\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2556053811659193,\n\
\ \"acc_stderr\": 0.029275891003969927,\n \"acc_norm\": 0.2556053811659193,\n\
\ \"acc_norm_stderr\": 0.029275891003969927\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3140495867768595,\n \"acc_stderr\": 0.042369647530410184,\n \"\
acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.042369647530410184\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.20245398773006135,\n \"acc_stderr\": 0.031570650789119026,\n\
\ \"acc_norm\": 0.20245398773006135,\n \"acc_norm_stderr\": 0.031570650789119026\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.041858325989283164,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.041858325989283164\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.32905982905982906,\n\
\ \"acc_stderr\": 0.030782321577688163,\n \"acc_norm\": 0.32905982905982906,\n\
\ \"acc_norm_stderr\": 0.030782321577688163\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24776500638569604,\n\
\ \"acc_stderr\": 0.015438083080568963,\n \"acc_norm\": 0.24776500638569604,\n\
\ \"acc_norm_stderr\": 0.015438083080568963\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.22832369942196531,\n \"acc_stderr\": 0.022598703804321628,\n\
\ \"acc_norm\": 0.22832369942196531,\n \"acc_norm_stderr\": 0.022598703804321628\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27039106145251396,\n\
\ \"acc_stderr\": 0.014854993938010102,\n \"acc_norm\": 0.27039106145251396,\n\
\ \"acc_norm_stderr\": 0.014854993938010102\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.024954184324879912,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.024954184324879912\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.22508038585209003,\n\
\ \"acc_stderr\": 0.023720088516179034,\n \"acc_norm\": 0.22508038585209003,\n\
\ \"acc_norm_stderr\": 0.023720088516179034\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.29012345679012347,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.29012345679012347,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843024,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843024\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2646675358539765,\n\
\ \"acc_stderr\": 0.011267332992845528,\n \"acc_norm\": 0.2646675358539765,\n\
\ \"acc_norm_stderr\": 0.011267332992845528\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.23897058823529413,\n \"acc_stderr\": 0.025905280644893006,\n\
\ \"acc_norm\": 0.23897058823529413,\n \"acc_norm_stderr\": 0.025905280644893006\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177795,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177795\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2653061224489796,\n \"acc_stderr\": 0.028263889943784606,\n\
\ \"acc_norm\": 0.2653061224489796,\n \"acc_norm_stderr\": 0.028263889943784606\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.29850746268656714,\n\
\ \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.29850746268656714,\n\
\ \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n\
\ \"acc_stderr\": 0.03680783690727581,\n \"acc_norm\": 0.3373493975903614,\n\
\ \"acc_norm_stderr\": 0.03680783690727581\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871117,\n \"mc2\": 0.37088025961521126,\n\
\ \"mc2_stderr\": 0.013858714584040949\n }\n}\n```"
repo_url: https://huggingface.co/RWKV/rwkv-raven-1b5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|arc:challenge|25_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hellaswag|10_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:21:18.307582.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:21:18.307582.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T05:21:18.307582.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T05:21:18.307582.parquet'
- config_name: results
data_files:
- split: 2023_09_03T05_21_18.307582
path:
- results_2023-09-03T05:21:18.307582.parquet
- split: latest
path:
- results_2023-09-03T05:21:18.307582.parquet
---
# Dataset Card for Evaluation run of RWKV/rwkv-raven-1b5
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/RWKV/rwkv-raven-1b5
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [RWKV/rwkv-raven-1b5](https://huggingface.co/RWKV/rwkv-raven-1b5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RWKV__rwkv-raven-1b5",
"harness_truthfulqa_mc_0",
split="train")
```
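Judging from the split names listed in this card's configuration, each timestamped split name is derived from the run timestamp by replacing the `-` and `:` characters with underscores (the fractional-seconds dot is kept). A minimal sketch of that convention, using a timestamp taken from this card:

```python
# Convert a run timestamp into the split name used by this dataset.
# Assumption (inferred from the splits listed above): '-' and ':' become '_',
# while the '.' before the fractional seconds is kept unchanged.
def timestamp_to_split(run_timestamp: str) -> str:
    return run_timestamp.replace("-", "_").replace(":", "_")

split_name = timestamp_to_split("2023-09-03T05:21:18.307582")
# split_name == "2023_09_03T05_21_18.307582", matching the splits listed above
```

You can then pass `split=split_name` (or simply `split="latest"`) to `load_dataset` instead of `split="train"`.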
## Latest results
These are the [latest results from run 2023-09-03T05:21:18.307582](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-raven-1b5/blob/main/results_2023-09-03T05%3A21%3A18.307582.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2624381073861863,
"acc_stderr": 0.03192172274184596,
"acc_norm": 0.2650819779663321,
"acc_norm_stderr": 0.031931036762425556,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871117,
"mc2": 0.37088025961521126,
"mc2_stderr": 0.013858714584040949
},
"harness|arc:challenge|25": {
"acc": 0.2815699658703072,
"acc_stderr": 0.013143376735009017,
"acc_norm": 0.318259385665529,
"acc_norm_stderr": 0.013611993916971453
},
"harness|hellaswag|10": {
"acc": 0.40669189404501094,
"acc_stderr": 0.004902125388002211,
"acc_norm": 0.5259908384783908,
"acc_norm_stderr": 0.004983035420235711
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.039446241625011175,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.039446241625011175
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.24342105263157895,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.24342105263157895,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.02619980880756193,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.02619980880756193
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364397,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364397
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02850485647051418,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02850485647051418
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813386,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813386
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.0360010569272777,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.0360010569272777
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790607,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790607
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27741935483870966,
"acc_stderr": 0.02547019683590005,
"acc_norm": 0.27741935483870966,
"acc_norm_stderr": 0.02547019683590005
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.030712730070982592,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.030712730070982592
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.29015544041450775,
"acc_stderr": 0.032752644677915145,
"acc_norm": 0.29015544041450775,
"acc_norm_stderr": 0.032752644677915145
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24615384615384617,
"acc_stderr": 0.021840866990423077,
"acc_norm": 0.24615384615384617,
"acc_norm_stderr": 0.021840866990423077
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21851851851851853,
"acc_stderr": 0.02519575225182379,
"acc_norm": 0.21851851851851853,
"acc_norm_stderr": 0.02519575225182379
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.027722065493361255,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.027722065493361255
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389985,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389985
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23119266055045873,
"acc_stderr": 0.01807575024163315,
"acc_norm": 0.23119266055045873,
"acc_norm_stderr": 0.01807575024163315
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.03179876342176852,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.03179876342176852
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.028867431449849303,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.028867431449849303
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598018,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598018
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2556053811659193,
"acc_stderr": 0.029275891003969927,
"acc_norm": 0.2556053811659193,
"acc_norm_stderr": 0.029275891003969927
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3140495867768595,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.3140495867768595,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.20245398773006135,
"acc_stderr": 0.031570650789119026,
"acc_norm": 0.20245398773006135,
"acc_norm_stderr": 0.031570650789119026
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.041858325989283164,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.041858325989283164
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.32905982905982906,
"acc_stderr": 0.030782321577688163,
"acc_norm": 0.32905982905982906,
"acc_norm_stderr": 0.030782321577688163
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24776500638569604,
"acc_stderr": 0.015438083080568963,
"acc_norm": 0.24776500638569604,
"acc_norm_stderr": 0.015438083080568963
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22832369942196531,
"acc_stderr": 0.022598703804321628,
"acc_norm": 0.22832369942196531,
"acc_norm_stderr": 0.022598703804321628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27039106145251396,
"acc_stderr": 0.014854993938010102,
"acc_norm": 0.27039106145251396,
"acc_norm_stderr": 0.014854993938010102
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.024954184324879912,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.024954184324879912
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.22508038585209003,
"acc_stderr": 0.023720088516179034,
"acc_norm": 0.22508038585209003,
"acc_norm_stderr": 0.023720088516179034
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.29012345679012347,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.29012345679012347,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843024,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843024
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2646675358539765,
"acc_stderr": 0.011267332992845528,
"acc_norm": 0.2646675358539765,
"acc_norm_stderr": 0.011267332992845528
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23897058823529413,
"acc_stderr": 0.025905280644893006,
"acc_norm": 0.23897058823529413,
"acc_norm_stderr": 0.025905280644893006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.017740899509177795,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.017740899509177795
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2653061224489796,
"acc_stderr": 0.028263889943784606,
"acc_norm": 0.2653061224489796,
"acc_norm_stderr": 0.028263889943784606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.29850746268656714,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.29850746268656714,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.03680783690727581,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.03680783690727581
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871117,
"mc2": 0.37088025961521126,
"mc2_stderr": 0.013858714584040949
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
MouseTrap/maow_maow_dataset_v2 | 2023-09-04T00:22:58.000Z | [
"region:us"
] | MouseTrap | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_vihangd__smartplat-3b-v3 | 2023-09-03T05:58:17.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of vihangd/smartplat-3b-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vihangd/smartplat-3b-v3](https://huggingface.co/vihangd/smartplat-3b-v3) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vihangd__smartplat-3b-v3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T05:57:03.660759](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__smartplat-3b-v3/blob/main/results_2023-09-03T05%3A57%3A03.660759.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25119472031775575,\n\
\ \"acc_stderr\": 0.031235901958333895,\n \"acc_norm\": 0.25488385371245836,\n\
\ \"acc_norm_stderr\": 0.031232145065757902,\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602576,\n \"mc2\": 0.36306814270012927,\n\
\ \"mc2_stderr\": 0.013779496158046956\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3651877133105802,\n \"acc_stderr\": 0.014070265519268804,\n\
\ \"acc_norm\": 0.3993174061433447,\n \"acc_norm_stderr\": 0.014312094557946707\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5286795459071898,\n\
\ \"acc_stderr\": 0.004981566295189443,\n \"acc_norm\": 0.7122087233618801,\n\
\ \"acc_norm_stderr\": 0.004518080594528022\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036622,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036622\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313141,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313141\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.026341480371118366,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.026341480371118366\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n\
\ \"acc_stderr\": 0.030299574664788137,\n \"acc_norm\": 0.19653179190751446,\n\
\ \"acc_norm_stderr\": 0.030299574664788137\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.19298245614035087,\n\
\ \"acc_stderr\": 0.037124548537213684,\n \"acc_norm\": 0.19298245614035087,\n\
\ \"acc_norm_stderr\": 0.037124548537213684\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20634920634920634,\n \"acc_stderr\": 0.020842290930114683,\n \"\
acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.020842290930114683\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22258064516129034,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.22258064516129034,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.15763546798029557,\n \"acc_stderr\": 0.025639014131172404,\n\
\ \"acc_norm\": 0.15763546798029557,\n \"acc_norm_stderr\": 0.025639014131172404\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.19696969696969696,\n \"acc_stderr\": 0.028335609732463355,\n \"\
acc_norm\": 0.19696969696969696,\n \"acc_norm_stderr\": 0.028335609732463355\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.029519282616817244,\n\
\ \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.029519282616817244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24358974358974358,\n \"acc_stderr\": 0.02176373368417393,\n\
\ \"acc_norm\": 0.24358974358974358,\n \"acc_norm_stderr\": 0.02176373368417393\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655068,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655068\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25630252100840334,\n \"acc_stderr\": 0.028359620870533953,\n\
\ \"acc_norm\": 0.25630252100840334,\n \"acc_norm_stderr\": 0.028359620870533953\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23669724770642203,\n \"acc_stderr\": 0.018224078117299085,\n \"\
acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.018224078117299085\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1574074074074074,\n \"acc_stderr\": 0.02483717351824239,\n \"\
acc_norm\": 0.1574074074074074,\n \"acc_norm_stderr\": 0.02483717351824239\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.0309645179269234,\n \"acc_norm\"\
: 0.2647058823529412,\n \"acc_norm_stderr\": 0.0309645179269234\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.2911392405063291,\n \"acc_stderr\": 0.02957160106575337,\n \"\
acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n\
\ \"acc_stderr\": 0.03273766725459156,\n \"acc_norm\": 0.3901345291479821,\n\
\ \"acc_norm_stderr\": 0.03273766725459156\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.21487603305785125,\n \"acc_stderr\": 0.03749492448709697,\n \"\
acc_norm\": 0.21487603305785125,\n \"acc_norm_stderr\": 0.03749492448709697\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467763,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467763\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1650485436893204,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.1650485436893204,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.29914529914529914,\n\
\ \"acc_stderr\": 0.029996951858349497,\n \"acc_norm\": 0.29914529914529914,\n\
\ \"acc_norm_stderr\": 0.029996951858349497\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23499361430395913,\n\
\ \"acc_stderr\": 0.015162024152278433,\n \"acc_norm\": 0.23499361430395913,\n\
\ \"acc_norm_stderr\": 0.015162024152278433\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.21676300578034682,\n \"acc_stderr\": 0.022183477668412853,\n\
\ \"acc_norm\": 0.21676300578034682,\n \"acc_norm_stderr\": 0.022183477668412853\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21895424836601307,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26366559485530544,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.26366559485530544,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.0258921511567094,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.0258921511567094\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24119947848761408,\n\
\ \"acc_stderr\": 0.010926496102034963,\n \"acc_norm\": 0.24119947848761408,\n\
\ \"acc_norm_stderr\": 0.010926496102034963\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.023157468308559345,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.023157468308559345\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.32727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.32727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22448979591836735,\n \"acc_stderr\": 0.02671143055553842,\n\
\ \"acc_norm\": 0.22448979591836735,\n \"acc_norm_stderr\": 0.02671143055553842\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n\
\ \"acc_stderr\": 0.030965903123573012,\n \"acc_norm\": 0.25870646766169153,\n\
\ \"acc_norm_stderr\": 0.030965903123573012\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602576,\n \"mc2\": 0.36306814270012927,\n\
\ \"mc2_stderr\": 0.013779496158046956\n }\n}\n```"
repo_url: https://huggingface.co/vihangd/smartplat-3b-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|arc:challenge|25_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hellaswag|10_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:57:03.660759.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:57:03.660759.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T05:57:03.660759.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T05:57:03.660759.parquet'
- config_name: results
data_files:
- split: 2023_09_03T05_57_03.660759
path:
- results_2023-09-03T05:57:03.660759.parquet
- split: latest
path:
- results_2023-09-03T05:57:03.660759.parquet
---
# Dataset Card for Evaluation run of vihangd/smartplat-3b-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/vihangd/smartplat-3b-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [vihangd/smartplat-3b-v3](https://huggingface.co/vihangd/smartplat-3b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vihangd__smartplat-3b-v3",
"harness_truthfulqa_mc_0",
	split="latest")
```
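The timestamped split names listed in the configs above are derived mechanically from the run timestamp, with `-` and `:` replaced by `_`. A small helper (hypothetical, not part of the `datasets` API) sketches the mapping, which is useful when selecting a specific historical run rather than the "latest" split:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Convert a run timestamp such as '2023-09-03T05:57:03.660759'
    into the split name used in this dataset's configs."""
    # Split names cannot contain '-' or ':', so both are mapped to '_'.
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-09-03T05:57:03.660759"))
# 2023_09_03T05_57_03.660759
```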
## Latest results
These are the [latest results from run 2023-09-03T05:57:03.660759](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__smartplat-3b-v3/blob/main/results_2023-09-03T05%3A57%3A03.660759.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each eval's results in its "latest" split):
```python
{
"all": {
"acc": 0.25119472031775575,
"acc_stderr": 0.031235901958333895,
"acc_norm": 0.25488385371245836,
"acc_norm_stderr": 0.031232145065757902,
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602576,
"mc2": 0.36306814270012927,
"mc2_stderr": 0.013779496158046956
},
"harness|arc:challenge|25": {
"acc": 0.3651877133105802,
"acc_stderr": 0.014070265519268804,
"acc_norm": 0.3993174061433447,
"acc_norm_stderr": 0.014312094557946707
},
"harness|hellaswag|10": {
"acc": 0.5286795459071898,
"acc_stderr": 0.004981566295189443,
"acc_norm": 0.7122087233618801,
"acc_norm_stderr": 0.004518080594528022
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313141,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313141
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.026341480371118366,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.026341480371118366
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.030299574664788137,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.030299574664788137
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.19298245614035087,
"acc_stderr": 0.037124548537213684,
"acc_norm": 0.19298245614035087,
"acc_norm_stderr": 0.037124548537213684
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.020842290930114683,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.020842290930114683
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22258064516129034,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.22258064516129034,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15763546798029557,
"acc_stderr": 0.025639014131172404,
"acc_norm": 0.15763546798029557,
"acc_norm_stderr": 0.025639014131172404
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.19696969696969696,
"acc_stderr": 0.028335609732463355,
"acc_norm": 0.19696969696969696,
"acc_norm_stderr": 0.028335609732463355
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.029519282616817244,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.029519282616817244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.02176373368417393,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.02176373368417393
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655068,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655068
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25630252100840334,
"acc_stderr": 0.028359620870533953,
"acc_norm": 0.25630252100840334,
"acc_norm_stderr": 0.028359620870533953
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23669724770642203,
"acc_stderr": 0.018224078117299085,
"acc_norm": 0.23669724770642203,
"acc_norm_stderr": 0.018224078117299085
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1574074074074074,
"acc_stderr": 0.02483717351824239,
"acc_norm": 0.1574074074074074,
"acc_norm_stderr": 0.02483717351824239
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.0309645179269234,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.0309645179269234
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3901345291479821,
"acc_stderr": 0.03273766725459156,
"acc_norm": 0.3901345291479821,
"acc_norm_stderr": 0.03273766725459156
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.21487603305785125,
"acc_stderr": 0.03749492448709697,
"acc_norm": 0.21487603305785125,
"acc_norm_stderr": 0.03749492448709697
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467763,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467763
},
"harness|hendrycksTest-management|5": {
"acc": 0.1650485436893204,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.1650485436893204,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.29914529914529914,
"acc_stderr": 0.029996951858349497,
"acc_norm": 0.29914529914529914,
"acc_norm_stderr": 0.029996951858349497
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23499361430395913,
"acc_stderr": 0.015162024152278433,
"acc_norm": 0.23499361430395913,
"acc_norm_stderr": 0.015162024152278433
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21676300578034682,
"acc_stderr": 0.022183477668412853,
"acc_norm": 0.21676300578034682,
"acc_norm_stderr": 0.022183477668412853
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21895424836601307,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.21895424836601307,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26366559485530544,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.26366559485530544,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967277,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.0258921511567094,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.0258921511567094
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24119947848761408,
"acc_stderr": 0.010926496102034963,
"acc_norm": 0.24119947848761408,
"acc_norm_stderr": 0.010926496102034963
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.023157468308559345,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.023157468308559345
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.04494290866252089,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.04494290866252089
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22448979591836735,
"acc_stderr": 0.02671143055553842,
"acc_norm": 0.22448979591836735,
"acc_norm_stderr": 0.02671143055553842
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573012,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573012
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602576,
"mc2": 0.36306814270012927,
"mc2_stderr": 0.013779496158046956
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
saritha123/sari-567 | 2023-09-03T06:17:24.000Z | [
"license:openrail",
"region:us"
] | saritha123 | null | null | null | 0 | 0 | ---
license: openrail
---
|
artdwn/dataset | 2023-09-03T12:55:53.000Z | [
"region:us"
] | artdwn | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Devio__test-1400 | 2023-09-03T06:26:32.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Devio/test-1400
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Devio/test-1400](https://huggingface.co/Devio/test-1400) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Devio__test-1400\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T06:25:15.872451](https://huggingface.co/datasets/open-llm-leaderboard/details_Devio__test-1400/blob/main/results_2023-09-03T06%3A25%3A15.872451.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.29066385939253414,\n\
\ \"acc_stderr\": 0.032634153881095015,\n \"acc_norm\": 0.2942628467289629,\n\
\ \"acc_norm_stderr\": 0.03263364427629342,\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.3686966632375142,\n\
\ \"mc2_stderr\": 0.014163025545486835\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35238907849829354,\n \"acc_stderr\": 0.013960142600598685,\n\
\ \"acc_norm\": 0.38139931740614336,\n \"acc_norm_stderr\": 0.014194389086685263\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4785899223262298,\n\
\ \"acc_stderr\": 0.004985204766555062,\n \"acc_norm\": 0.6619199362676758,\n\
\ \"acc_norm_stderr\": 0.004720891597174716\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.036333844140734636,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.036333844140734636\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3169811320754717,\n \"acc_stderr\": 0.028637235639800935,\n\
\ \"acc_norm\": 0.3169811320754717,\n \"acc_norm_stderr\": 0.028637235639800935\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080343,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080343\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3063583815028902,\n\
\ \"acc_stderr\": 0.03514942551267439,\n \"acc_norm\": 0.3063583815028902,\n\
\ \"acc_norm_stderr\": 0.03514942551267439\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.03106898596312215,\n\
\ \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.03106898596312215\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.023266512213730564,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.023266512213730564\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3225806451612903,\n\
\ \"acc_stderr\": 0.026593084516572274,\n \"acc_norm\": 0.3225806451612903,\n\
\ \"acc_norm_stderr\": 0.026593084516572274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0317852971064275,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0317852971064275\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3686868686868687,\n \"acc_stderr\": 0.034373055019806184,\n \"\
acc_norm\": 0.3686868686868687,\n \"acc_norm_stderr\": 0.034373055019806184\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35233160621761656,\n \"acc_stderr\": 0.03447478286414359,\n\
\ \"acc_norm\": 0.35233160621761656,\n \"acc_norm_stderr\": 0.03447478286414359\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36153846153846153,\n \"acc_stderr\": 0.02435958146539698,\n\
\ \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.02435958146539698\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275805,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275805\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.33613445378151263,\n \"acc_stderr\": 0.03068473711513536,\n\
\ \"acc_norm\": 0.33613445378151263,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3376146788990826,\n \"acc_stderr\": 0.0202752659866389,\n \"acc_norm\"\
: 0.3376146788990826,\n \"acc_norm_stderr\": 0.0202752659866389\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.03350991604696043,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.03350991604696043\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.03077855467869326,\n\
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.19831223628691982,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.19831223628691982,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.26905829596412556,\n\
\ \"acc_stderr\": 0.029763779406874972,\n \"acc_norm\": 0.26905829596412556,\n\
\ \"acc_norm_stderr\": 0.029763779406874972\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.31297709923664124,\n \"acc_stderr\": 0.04066962905677697,\n\
\ \"acc_norm\": 0.31297709923664124,\n \"acc_norm_stderr\": 0.04066962905677697\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.1322314049586777,\n \"acc_stderr\": 0.030922788320445784,\n \"\
acc_norm\": 0.1322314049586777,\n \"acc_norm_stderr\": 0.030922788320445784\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2085889570552147,\n \"acc_stderr\": 0.03192193448934722,\n\
\ \"acc_norm\": 0.2085889570552147,\n \"acc_norm_stderr\": 0.03192193448934722\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n\
\ \"acc_stderr\": 0.03485946096475741,\n \"acc_norm\": 0.16071428571428573,\n\
\ \"acc_norm_stderr\": 0.03485946096475741\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4077669902912621,\n \"acc_stderr\": 0.048657775704107696,\n\
\ \"acc_norm\": 0.4077669902912621,\n \"acc_norm_stderr\": 0.048657775704107696\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23504273504273504,\n\
\ \"acc_stderr\": 0.02777883590493543,\n \"acc_norm\": 0.23504273504273504,\n\
\ \"acc_norm_stderr\": 0.02777883590493543\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24521072796934865,\n\
\ \"acc_stderr\": 0.015384352284543932,\n \"acc_norm\": 0.24521072796934865,\n\
\ \"acc_norm_stderr\": 0.015384352284543932\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2745664739884393,\n \"acc_stderr\": 0.024027745155265023,\n\
\ \"acc_norm\": 0.2745664739884393,\n \"acc_norm_stderr\": 0.024027745155265023\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.014551553659369922,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.014551553659369922\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3104575163398693,\n \"acc_stderr\": 0.0264930332251459,\n\
\ \"acc_norm\": 0.3104575163398693,\n \"acc_norm_stderr\": 0.0264930332251459\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.29260450160771706,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.28368794326241137,\n \"acc_stderr\": 0.026891709428343957,\n \
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.026891709428343957\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n\
\ \"acc_stderr\": 0.010966507972178479,\n \"acc_norm\": 0.2438070404172099,\n\
\ \"acc_norm_stderr\": 0.010966507972178479\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.03000856284500347,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.03000856284500347\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.23529411764705882,\n \"acc_stderr\": 0.01716058723504635,\n \
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.01716058723504635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.2818181818181818,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.03333333333333335,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.03333333333333335\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n\
\ \"acc_stderr\": 0.03384429155233136,\n \"acc_norm\": 0.25301204819277107,\n\
\ \"acc_norm_stderr\": 0.03384429155233136\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23391812865497075,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.23391812865497075,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.3686966632375142,\n\
\ \"mc2_stderr\": 0.014163025545486835\n }\n}\n```"
repo_url: https://huggingface.co/Devio/test-1400
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|arc:challenge|25_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hellaswag|10_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T06:25:15.872451.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T06:25:15.872451.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T06:25:15.872451.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T06:25:15.872451.parquet'
- config_name: results
data_files:
- split: 2023_09_03T06_25_15.872451
path:
- results_2023-09-03T06:25:15.872451.parquet
- split: latest
path:
- results_2023-09-03T06:25:15.872451.parquet
---
# Dataset Card for Evaluation run of Devio/test-1400
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Devio/test-1400
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Devio/test-1400](https://huggingface.co/Devio/test-1400) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Devio__test-1400",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-03T06:25:15.872451](https://huggingface.co/datasets/open-llm-leaderboard/details_Devio__test-1400/blob/main/results_2023-09-03T06%3A25%3A15.872451.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.29066385939253414,
"acc_stderr": 0.032634153881095015,
"acc_norm": 0.2942628467289629,
"acc_norm_stderr": 0.03263364427629342,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.3686966632375142,
"mc2_stderr": 0.014163025545486835
},
"harness|arc:challenge|25": {
"acc": 0.35238907849829354,
"acc_stderr": 0.013960142600598685,
"acc_norm": 0.38139931740614336,
"acc_norm_stderr": 0.014194389086685263
},
"harness|hellaswag|10": {
"acc": 0.4785899223262298,
"acc_stderr": 0.004985204766555062,
"acc_norm": 0.6619199362676758,
"acc_norm_stderr": 0.004720891597174716
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.036333844140734636,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.036333844140734636
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3169811320754717,
"acc_stderr": 0.028637235639800935,
"acc_norm": 0.3169811320754717,
"acc_norm_stderr": 0.028637235639800935
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080343,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080343
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3063583815028902,
"acc_stderr": 0.03514942551267439,
"acc_norm": 0.3063583815028902,
"acc_norm_stderr": 0.03514942551267439
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3446808510638298,
"acc_stderr": 0.03106898596312215,
"acc_norm": 0.3446808510638298,
"acc_norm_stderr": 0.03106898596312215
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.023266512213730564,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.023266512213730564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3225806451612903,
"acc_stderr": 0.026593084516572274,
"acc_norm": 0.3225806451612903,
"acc_norm_stderr": 0.026593084516572274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0317852971064275,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0317852971064275
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3686868686868687,
"acc_stderr": 0.034373055019806184,
"acc_norm": 0.3686868686868687,
"acc_norm_stderr": 0.034373055019806184
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35233160621761656,
"acc_stderr": 0.03447478286414359,
"acc_norm": 0.35233160621761656,
"acc_norm_stderr": 0.03447478286414359
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.02435958146539698,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.02435958146539698
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275805,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275805
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.33613445378151263,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.33613445378151263,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3376146788990826,
"acc_stderr": 0.0202752659866389,
"acc_norm": 0.3376146788990826,
"acc_norm_stderr": 0.0202752659866389
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696043,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.19831223628691982,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.19831223628691982,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.26905829596412556,
"acc_stderr": 0.029763779406874972,
"acc_norm": 0.26905829596412556,
"acc_norm_stderr": 0.029763779406874972
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.31297709923664124,
"acc_stderr": 0.04066962905677697,
"acc_norm": 0.31297709923664124,
"acc_norm_stderr": 0.04066962905677697
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.1322314049586777,
"acc_stderr": 0.030922788320445784,
"acc_norm": 0.1322314049586777,
"acc_norm_stderr": 0.030922788320445784
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2085889570552147,
"acc_stderr": 0.03192193448934722,
"acc_norm": 0.2085889570552147,
"acc_norm_stderr": 0.03192193448934722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16071428571428573,
"acc_stderr": 0.03485946096475741,
"acc_norm": 0.16071428571428573,
"acc_norm_stderr": 0.03485946096475741
},
"harness|hendrycksTest-management|5": {
"acc": 0.4077669902912621,
"acc_stderr": 0.048657775704107696,
"acc_norm": 0.4077669902912621,
"acc_norm_stderr": 0.048657775704107696
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23504273504273504,
"acc_stderr": 0.02777883590493543,
"acc_norm": 0.23504273504273504,
"acc_norm_stderr": 0.02777883590493543
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24521072796934865,
"acc_stderr": 0.015384352284543932,
"acc_norm": 0.24521072796934865,
"acc_norm_stderr": 0.015384352284543932
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2745664739884393,
"acc_stderr": 0.024027745155265023,
"acc_norm": 0.2745664739884393,
"acc_norm_stderr": 0.024027745155265023
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369922,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369922
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3104575163398693,
"acc_stderr": 0.0264930332251459,
"acc_norm": 0.3104575163398693,
"acc_norm_stderr": 0.0264930332251459
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.29260450160771706,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.29260450160771706,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.026891709428343957,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.026891709428343957
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178479,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178479
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.03000856284500347,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.03000856284500347
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.01716058723504635,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.01716058723504635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03333333333333335,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03333333333333335
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.03384429155233136,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.03384429155233136
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23391812865497075,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.23391812865497075,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.3686966632375142,
"mc2_stderr": 0.014163025545486835
}
}
```
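For quick inspection, a results blob like the one above can be flattened into a ranked list of per-task scores. The `results` dict below reproduces just two entries from the JSON for brevity; in practice you would `json.load()` the linked results file first:

```python
# Rank tasks from an aggregated-results dict shaped like the JSON above.
# `results` reproduces only two entries for brevity.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.38139931740614336},
    "harness|hellaswag|10": {"acc_norm": 0.6619199362676758},
}

def rank_tasks(results, metric="acc_norm"):
    """Return (task, score) pairs sorted by `metric`, best first."""
    scored = [(t, v[metric]) for t, v in results.items() if metric in v]
    return sorted(scored, key=lambda kv: kv[1], reverse=True)

for task, score in rank_tasks(results):
    print(f"{task}: {score:.4f}")
```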
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Lilsunx/saritha | 2023-09-03T06:48:26.000Z | [
"license:openrail",
"region:us"
] | Lilsunx | null | null | null | 0 | 0 | ---
license: openrail
---
|
Hosioka/WDXL_Aesthetic_Datasets_WIP | 2023-09-03T07:04:15.000Z | [
"license:openrail",
"region:us"
] | Hosioka | null | null | null | 0 | 0 | ---
license: openrail
---
|
msilich/test-openassistant-guanaco | 2023-09-03T07:14:25.000Z | [
"region:us"
] | msilich | null | null | null | 0 | 0 | Entry not found |
sudiptabasak/expressions-vectors | 2023-09-04T11:57:31.000Z | [
"license:mit",
"region:us"
] | sudiptabasak | null | null | null | 0 | 0 | ---
license: mit
---
|
hassanMurad/sahiBukhari | 2023-09-03T10:28:42.000Z | [
"region:us"
] | hassanMurad | null | null | null | 0 | 0 | Entry not found |
miaoyh32/ipc2clc_812 | 2023-09-03T07:53:55.000Z | [
"region:us"
] | miaoyh32 | null | null | null | 0 | 0 | Entry not found |
Admin08077/Feature_Extractor | 2023-09-13T20:38:41.000Z | [
"license:other",
"region:us"
] | Admin08077 | null | null | null | 0 | 0 | ---
license: other
---
from enum import Enum
from uuid import UUID
from typing import Optional, Union, List
class HashType(Enum):
SHA256 = "SHA256"
class URL(Enum):
FILES = "/Files"
URL_FILES = "/files"
class File:
url: URL
file_name: str
hash_type: HashType
hash: str
size: int
def __init__(self, url: URL, file_name: str, hash_type: HashType, hash: str, size: int) -> None:
self.url = url
self.file_name = file_name
self.hash_type = hash_type
self.hash = hash
self.size = size
class FeatueExtractorConfiguration:
file: File
def __init__(self, file: File) -> None:
self.file = file
class FeatureInfoElement:
feature_id: str
min_weight: str
def __init__(self, feature_id: str, min_weight: str) -> None:
self.feature_id = feature_id
self.min_weight = min_weight
class FeatueExtractorFeatures:
feature_info: FeatureInfoElement
def __init__(self, feature_info: FeatureInfoElement) -> None:
self.feature_info = feature_info
class Level(Enum):
COLUMN_DATA = "ColumnData"
COLUMN_SCHEMA = "ColumnSchema"
class TypeEnum(Enum):
BLOOMIER_FILTER = "BloomierFilter"
CHECK_DIGIT_VALIDATOR = "CheckDigitValidator"
CREDIT_CARD = "CreditCard"
DICTIONARY = "Dictionary"
HYBRID = "Hybrid"
MIME = "Mime"
MODEL_METADATA = "ModelMetadata"
NUMERIC = "Numeric"
REG_EX = "RegEx"
class FeatueExtractor:
id: UUID
version: int
configuration: FeatueExtractorConfiguration
features: FeatueExtractorFeatures
type: TypeEnum
level: Level
def __init__(self, id: UUID, version: int, configuration: FeatueExtractorConfiguration, features: FeatueExtractorFeatures, type: TypeEnum, level: Level) -> None:
self.id = id
self.version = version
self.configuration = configuration
self.features = features
self.type = type
self.level = level
class DataClass:
cdata: str
def __init__(self, cdata: str) -> None:
self.cdata = cdata
class FeatureExtractorConfiguration:
file: Optional[File]
data: Union[DataClass, None, str]
text: Optional[str]
def __init__(self, file: Optional[File], data: Union[DataClass, None, str], text: Optional[str]) -> None:
self.file = file
self.data = data
self.text = text
class FeatureExtractorFeatures:
feature_info: Union[List[FeatureInfoElement], FeatureInfoElement]
def __init__(self, feature_info: Union[List[FeatureInfoElement], FeatureInfoElement]) -> None:
self.feature_info = feature_info
class FeatureExtractor:
id: str
version: int
configuration: FeatureExtractorConfiguration
features: FeatureExtractorFeatures
type: TypeEnum
level: Level
sub_feature_extractor: Optional[bool]
def __init__(self, id: str, version: int, configuration: FeatureExtractorConfiguration, features: FeatureExtractorFeatures, type: TypeEnum, level: Level, sub_feature_extractor: Optional[bool]) -> None:
self.id = id
self.version = version
self.configuration = configuration
self.features = features
self.type = type
self.level = level
self.sub_feature_extractor = sub_feature_extractor
class ArrayOfFeatureExtractor:
featue_extractor: FeatueExtractor
feature_extractor: List[FeatureExtractor]
xmlns: str
xmlns_xsi: str
xmlns_xsd: str
def __init__(self, featue_extractor: FeatueExtractor, feature_extractor: List[FeatureExtractor], xmlns: str, xmlns_xsi: str, xmlns_xsd: str) -> None:
self.featue_extractor = featue_extractor
self.feature_extractor = feature_extractor
self.xmlns = xmlns
self.xmlns_xsi = xmlns_xsi
self.xmlns_xsd = xmlns_xsd
class Welcome1:
array_of_feature_extractor: ArrayOfFeatureExtractor
def __init__(self, array_of_feature_extractor: ArrayOfFeatureExtractor) -> None:
self.array_of_feature_extractor = array_of_feature_extractor
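As a sanity check, the generated classes can be exercised like this. The sketch re-states only the `HashType`/`URL`/`File` slice so it runs standalone, and every value passed in is an illustrative placeholder, not taken from any real payload:

```python
from enum import Enum

# Minimal standalone restatement of the generated classes above.
class HashType(Enum):
    SHA256 = "SHA256"

class URL(Enum):
    FILES = "/Files"

class File:
    def __init__(self, url: URL, file_name: str, hash_type: HashType,
                 hash: str, size: int) -> None:
        self.url = url
        self.file_name = file_name
        self.hash_type = hash_type
        self.hash = hash
        self.size = size

# Placeholder values only: file name, hash, and size are illustrative.
f = File(URL.FILES, "features.bin", HashType.SHA256, "0" * 64, 2048)
print(f.url.value, f.hash_type.value, f.size)
```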
|
syp1229/Y_frequency | 2023-09-03T08:34:04.000Z | [
"region:us"
] | syp1229 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sample_rate
dtype: int64
- name: text
dtype: string
- name: scriptId
dtype: int64
- name: fileNm
dtype: string
- name: recrdTime
dtype: float64
- name: recrdQuality
dtype: int64
- name: recrdDt
dtype: string
- name: scriptSetNo
dtype: string
- name: recrdEnvrn
dtype: string
- name: colctUnitCode
dtype: string
- name: cityCode
dtype: string
- name: recrdUnit
dtype: string
- name: convrsThema
dtype: string
- name: gender
dtype: string
- name: recorderId
dtype: string
- name: age
dtype: int64
splits:
- name: train
num_bytes: 4106414213
num_examples: 5400
download_size: 2632190153
dataset_size: 4106414213
---
# Dataset Card for "Y_frequency"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MonishMeher/kiranbedi-new | 2023-09-03T08:46:45.000Z | [
"region:us"
] | MonishMeher | null | null | null | 0 | 0 | Entry not found |
tinhpx2911/wiki-vn-process | 2023-09-03T08:59:24.000Z | [
"region:us"
] | tinhpx2911 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: title
dtype: string
- name: categories
dtype: string
splits:
- name: train
num_bytes: 1373835675
num_examples: 419581
download_size: 722564655
dataset_size: 1373835675
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wiki-vn-process"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_RWKV__rwkv-4-7b-pile | 2023-09-03T08:58:46.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of RWKV/rwkv-4-7b-pile
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RWKV/rwkv-4-7b-pile](https://huggingface.co/RWKV/rwkv-4-7b-pile) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RWKV__rwkv-4-7b-pile\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T08:57:35.818807](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-4-7b-pile/blob/main/results_2023-09-03T08%3A57%3A35.818807.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2554216796327559,\n\
\ \"acc_stderr\": 0.03158160101497281,\n \"acc_norm\": 0.25909677468992814,\n\
\ \"acc_norm_stderr\": 0.03158243643577924,\n \"mc1\": 0.20685434516523868,\n\
\ \"mc1_stderr\": 0.014179591496728339,\n \"mc2\": 0.3365298312185615,\n\
\ \"mc2_stderr\": 0.01322843815854612\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35409556313993173,\n \"acc_stderr\": 0.01397545412275656,\n\
\ \"acc_norm\": 0.3967576791808874,\n \"acc_norm_stderr\": 0.014296513020180639\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.48894642501493724,\n\
\ \"acc_stderr\": 0.0049885619442773905,\n \"acc_norm\": 0.663114917347142,\n\
\ \"acc_norm_stderr\": 0.004716792874433194\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.0335567721631314,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.0335567721631314\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.025447863825108632,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.025447863825108632\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617748,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617748\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.028185441301234106,\n\
\ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.028185441301234106\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518752,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518752\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924318,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924318\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.23225806451612904,\n \"acc_stderr\": 0.024022256130308235,\n \"\
acc_norm\": 0.23225806451612904,\n \"acc_norm_stderr\": 0.024022256130308235\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n \"\
acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.03317505930009179,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.03317505930009179\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.1717171717171717,\n \"acc_stderr\": 0.026869716187429914,\n \"\
acc_norm\": 0.1717171717171717,\n \"acc_norm_stderr\": 0.026869716187429914\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27979274611398963,\n \"acc_stderr\": 0.03239637046735704,\n\
\ \"acc_norm\": 0.27979274611398963,\n \"acc_norm_stderr\": 0.03239637046735704\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722127992,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722127992\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25210084033613445,\n \"acc_stderr\": 0.02820554503327773,\n\
\ \"acc_norm\": 0.25210084033613445,\n \"acc_norm_stderr\": 0.02820554503327773\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2119205298013245,\n \"acc_stderr\": 0.033367670865679766,\n \"\
acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.033367670865679766\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.2018348623853211,\n \"acc_stderr\": 0.017208579357787548,\n \"\
acc_norm\": 0.2018348623853211,\n \"acc_norm_stderr\": 0.017208579357787548\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.17592592592592593,\n \"acc_stderr\": 0.025967420958258526,\n \"\
acc_norm\": 0.17592592592592593,\n \"acc_norm_stderr\": 0.025967420958258526\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.0309645179269234,\n \"acc_norm\"\
: 0.2647058823529412,\n \"acc_norm_stderr\": 0.0309645179269234\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.23628691983122363,\n \"acc_stderr\": 0.02765215314415926,\n \"\
acc_norm\": 0.23628691983122363,\n \"acc_norm_stderr\": 0.02765215314415926\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2645739910313901,\n\
\ \"acc_stderr\": 0.0296051032170383,\n \"acc_norm\": 0.2645739910313901,\n\
\ \"acc_norm_stderr\": 0.0296051032170383\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591205,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591205\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n\
\ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.23931623931623933,\n\
\ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24648786717752236,\n\
\ \"acc_stderr\": 0.015411308769686927,\n \"acc_norm\": 0.24648786717752236,\n\
\ \"acc_norm_stderr\": 0.015411308769686927\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2861271676300578,\n \"acc_stderr\": 0.024332146779134117,\n\
\ \"acc_norm\": 0.2861271676300578,\n \"acc_norm_stderr\": 0.024332146779134117\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22569832402234638,\n\
\ \"acc_stderr\": 0.013981395058455063,\n \"acc_norm\": 0.22569832402234638,\n\
\ \"acc_norm_stderr\": 0.013981395058455063\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.22186495176848875,\n\
\ \"acc_stderr\": 0.023598858292863047,\n \"acc_norm\": 0.22186495176848875,\n\
\ \"acc_norm_stderr\": 0.023598858292863047\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902023,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902023\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27509778357235987,\n\
\ \"acc_stderr\": 0.01140544362099694,\n \"acc_norm\": 0.27509778357235987,\n\
\ \"acc_norm_stderr\": 0.01140544362099694\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.29044117647058826,\n \"acc_stderr\": 0.027576468622740512,\n\
\ \"acc_norm\": 0.29044117647058826,\n \"acc_norm_stderr\": 0.027576468622740512\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2696078431372549,\n \"acc_stderr\": 0.017952449196987866,\n \
\ \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.017952449196987866\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.028920583220675578,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.028920583220675578\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.1890547263681592,\n\
\ \"acc_stderr\": 0.027686913588013003,\n \"acc_norm\": 0.1890547263681592,\n\
\ \"acc_norm_stderr\": 0.027686913588013003\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.0357160923005348,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.0357160923005348\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20685434516523868,\n\
\ \"mc1_stderr\": 0.014179591496728339,\n \"mc2\": 0.3365298312185615,\n\
\ \"mc2_stderr\": 0.01322843815854612\n }\n}\n```"
repo_url: https://huggingface.co/RWKV/rwkv-4-7b-pile
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|arc:challenge|25_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hellaswag|10_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T08:57:35.818807.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T08:57:35.818807.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T08:57:35.818807.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T08:57:35.818807.parquet'
- config_name: results
data_files:
- split: 2023_09_03T08_57_35.818807
path:
- results_2023-09-03T08:57:35.818807.parquet
- split: latest
path:
- results_2023-09-03T08:57:35.818807.parquet
---
# Dataset Card for Evaluation run of RWKV/rwkv-4-7b-pile
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/RWKV/rwkv-4-7b-pile
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [RWKV/rwkv-4-7b-pile](https://huggingface.co/RWKV/rwkv-4-7b-pile) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RWKV__rwkv-4-7b-pile",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-03T08:57:35.818807](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-4-7b-pile/blob/main/results_2023-09-03T08%3A57%3A35.818807.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.2554216796327559,
"acc_stderr": 0.03158160101497281,
"acc_norm": 0.25909677468992814,
"acc_norm_stderr": 0.03158243643577924,
"mc1": 0.20685434516523868,
"mc1_stderr": 0.014179591496728339,
"mc2": 0.3365298312185615,
"mc2_stderr": 0.01322843815854612
},
"harness|arc:challenge|25": {
"acc": 0.35409556313993173,
"acc_stderr": 0.01397545412275656,
"acc_norm": 0.3967576791808874,
"acc_norm_stderr": 0.014296513020180639
},
"harness|hellaswag|10": {
"acc": 0.48894642501493724,
"acc_stderr": 0.0049885619442773905,
"acc_norm": 0.663114917347142,
"acc_norm_stderr": 0.004716792874433194
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.0335567721631314,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.0335567721631314
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23026315789473684,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.23026315789473684,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.025447863825108632,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.025447863825108632
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617748,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617748
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.028185441301234106,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.028185441301234106
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518752,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518752
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924318,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924318
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23225806451612904,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.23225806451612904,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694436,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694436
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.03317505930009179,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.03317505930009179
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1717171717171717,
"acc_stderr": 0.026869716187429914,
"acc_norm": 0.1717171717171717,
"acc_norm_stderr": 0.026869716187429914
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27979274611398963,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.27979274611398963,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722127992,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722127992
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25210084033613445,
"acc_stderr": 0.02820554503327773,
"acc_norm": 0.25210084033613445,
"acc_norm_stderr": 0.02820554503327773
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.033367670865679766,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.033367670865679766
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2018348623853211,
"acc_stderr": 0.017208579357787548,
"acc_norm": 0.2018348623853211,
"acc_norm_stderr": 0.017208579357787548
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.17592592592592593,
"acc_stderr": 0.025967420958258526,
"acc_norm": 0.17592592592592593,
"acc_norm_stderr": 0.025967420958258526
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.0309645179269234,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.0309645179269234
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.23628691983122363,
"acc_stderr": 0.02765215314415926,
"acc_norm": 0.23628691983122363,
"acc_norm_stderr": 0.02765215314415926
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2645739910313901,
"acc_stderr": 0.0296051032170383,
"acc_norm": 0.2645739910313901,
"acc_norm_stderr": 0.0296051032170383
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23931623931623933,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.23931623931623933,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24648786717752236,
"acc_stderr": 0.015411308769686927,
"acc_norm": 0.24648786717752236,
"acc_norm_stderr": 0.015411308769686927
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2861271676300578,
"acc_stderr": 0.024332146779134117,
"acc_norm": 0.2861271676300578,
"acc_norm_stderr": 0.024332146779134117
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22569832402234638,
"acc_stderr": 0.013981395058455063,
"acc_norm": 0.22569832402234638,
"acc_norm_stderr": 0.013981395058455063
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.22186495176848875,
"acc_stderr": 0.023598858292863047,
"acc_norm": 0.22186495176848875,
"acc_norm_stderr": 0.023598858292863047
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902023,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902023
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27509778357235987,
"acc_stderr": 0.01140544362099694,
"acc_norm": 0.27509778357235987,
"acc_norm_stderr": 0.01140544362099694
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.29044117647058826,
"acc_stderr": 0.027576468622740512,
"acc_norm": 0.29044117647058826,
"acc_norm_stderr": 0.027576468622740512
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.017952449196987866,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.017952449196987866
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.028920583220675578,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.028920583220675578
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.1890547263681592,
"acc_stderr": 0.027686913588013003,
"acc_norm": 0.1890547263681592,
"acc_norm_stderr": 0.027686913588013003
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.0357160923005348,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.0357160923005348
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.20685434516523868,
"mc1_stderr": 0.014179591496728339,
"mc2": 0.3365298312185615,
"mc2_stderr": 0.01322843815854612
}
}
```
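As a rough sanity check on results like these, a top-level aggregate such as the `"all"` block can be approximated from the per-task entries. The snippet below is a sketch only: it assumes an unweighted mean over a hypothetical subset of tasks, while the actual aggregation is defined by the leaderboard code:

```python
# Sketch: recompute an unweighted macro-average over per-task accuracies.
# Both the task subset and the averaging rule are assumptions for illustration.
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.24,
    "harness|hendrycksTest-anatomy|5": 0.18518518518518517,
    "harness|hendrycksTest-astronomy|5": 0.23026315789473684,
}
macro_avg = sum(task_acc.values()) / len(task_acc)
print(round(macro_avg, 4))  # 0.2185
```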
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
huangyt/FINETUNE2_TEST | 2023-09-03T09:31:24.000Z | [
"region:us"
] | huangyt | null | null | null | 0 | 0 | # 📔 **DATASET**
| **Dataset** | Class | Number of Questions |
| ------- | ----------------------------------------------------------------- | ------------------------ |
| **Prm800k** | Reasoning, MATH | 6713 |
| **ScienceQA** | ScienceQA | 5177 |
| **SciBench** | ScienceQA | 695 |
| **ReClor** | Reasoning | 1624 |
| **TheoremQA** | Commonsense, MATH, ScienceQA | 800 |
| **OpenBookQA** | Text_Understanding, Reasoning, Commonsense, ScienceQA | 5957 |
| **ARB** | Reasoning, MATH, ScienceQA, Commonsense, Text_Understanding | 605 |
| **Openassistant-guanaco** | Commonsense, Text_Understanding, Reasoning | 802 |
# 📌 **PURPOSE**
To verify the usefulness of CoT (chain-of-thought) style datasets, and to understand whether this type of data can improve reasoning ability.
|
YoungPhlo/GPT4LLM-unnatural_instruction_standardized | 2023-09-03T09:34:52.000Z | [
"region:us"
] | YoungPhlo | null | null | null | 0 | 0 | Entry not found |
eyalshub/reddit-demo | 2023-09-03T09:46:12.000Z | [
"region:us"
] | eyalshub | null | null | null | 0 | 0 | Entry not found |
sngsfydy/Messidor2 | 2023-09-03T09:55:23.000Z | [
"region:us"
] | sngsfydy | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
splits:
- name: train
num_bytes: 2037815855.296
num_examples: 1744
download_size: 2542069212
dataset_size: 2037815855.296
---
# Dataset Card for "Messidor2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tinhpx2911/viwiki-processed | 2023-09-03T10:05:59.000Z | [
"region:us"
] | tinhpx2911 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: title
dtype: string
- name: categories
dtype: string
splits:
- name: train
num_bytes: 1373835675
num_examples: 419581
download_size: 722564655
dataset_size: 1373835675
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "viwiki-processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KatMarie/basqueparl_text | 2023-09-03T10:00:42.000Z | [
"region:us"
] | KatMarie | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 22103017
num_examples: 133599
download_size: 11392783
dataset_size: 22103017
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "basqueparl_text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_RWKV__rwkv-4-3b-pile | 2023-09-03T10:12:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of RWKV/rwkv-4-3b-pile
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RWKV/rwkv-4-3b-pile](https://huggingface.co/RWKV/rwkv-4-3b-pile) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RWKV__rwkv-4-3b-pile\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T10:11:24.206687](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-4-3b-pile/blob/main/results_2023-09-03T10%3A11%3A24.206687.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2513377700414879,\n\
\ \"acc_stderr\": 0.03144526214368921,\n \"acc_norm\": 0.25456033465116495,\n\
\ \"acc_norm_stderr\": 0.0314500028902646,\n \"mc1\": 0.19828641370869032,\n\
\ \"mc1_stderr\": 0.013957608783385556,\n \"mc2\": 0.32137022625465045,\n\
\ \"mc2_stderr\": 0.013141258133613963\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3250853242320819,\n \"acc_stderr\": 0.013688147309729117,\n\
\ \"acc_norm\": 0.36006825938566556,\n \"acc_norm_stderr\": 0.014027516814585183\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44144592710615416,\n\
\ \"acc_stderr\": 0.00495544756469405,\n \"acc_norm\": 0.5965943039235212,\n\
\ \"acc_norm_stderr\": 0.0048957821077864885\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066652,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066652\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677084,\n\
\ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677084\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.02725726032249485,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.02725726032249485\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.035146974678623884,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.035146974678623884\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.029241883869628834,\n\
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.029241883869628834\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131183,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131183\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533485,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533485\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2709677419354839,\n\
\ \"acc_stderr\": 0.025284416114900156,\n \"acc_norm\": 0.2709677419354839,\n\
\ \"acc_norm_stderr\": 0.025284416114900156\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.1919191919191919,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.1919191919191919,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.029519282616817234,\n\
\ \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.029519282616817234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21025641025641026,\n \"acc_stderr\": 0.02066059748502692,\n\
\ \"acc_norm\": 0.21025641025641026,\n \"acc_norm_stderr\": 0.02066059748502692\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.21481481481481482,\n \"acc_stderr\": 0.025040443877000673,\n \
\ \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.025040443877000673\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279472,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279472\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.032578473844367746,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.032578473844367746\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23302752293577983,\n \"acc_stderr\": 0.018125669180861493,\n \"\
acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.018125669180861493\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.19444444444444445,\n \"acc_stderr\": 0.02699145450203672,\n \"\
acc_norm\": 0.19444444444444445,\n \"acc_norm_stderr\": 0.02699145450203672\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2911392405063291,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34977578475336324,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.34977578475336324,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.03989139859531773,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.03989139859531773\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23076923076923078,\n\
\ \"acc_stderr\": 0.027601921381417593,\n \"acc_norm\": 0.23076923076923078,\n\
\ \"acc_norm_stderr\": 0.027601921381417593\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26181353767560667,\n\
\ \"acc_stderr\": 0.015720838678445252,\n \"acc_norm\": 0.26181353767560667,\n\
\ \"acc_norm_stderr\": 0.015720838678445252\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.022894082489925992,\n\
\ \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.022894082489925992\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.024954184324879905,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.024954184324879905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2347266881028939,\n\
\ \"acc_stderr\": 0.024071805887677048,\n \"acc_norm\": 0.2347266881028939,\n\
\ \"acc_norm_stderr\": 0.024071805887677048\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2839506172839506,\n \"acc_stderr\": 0.02508947852376513,\n\
\ \"acc_norm\": 0.2839506172839506,\n \"acc_norm_stderr\": 0.02508947852376513\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880592,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880592\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n\
\ \"acc_stderr\": 0.010966507972178475,\n \"acc_norm\": 0.2438070404172099,\n\
\ \"acc_norm_stderr\": 0.010966507972178475\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177795,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177795\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.21224489795918366,\n \"acc_stderr\": 0.026176967197866767,\n\
\ \"acc_norm\": 0.21224489795918366,\n \"acc_norm_stderr\": 0.026176967197866767\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21393034825870647,\n\
\ \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.21393034825870647,\n\
\ \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.03610805018031024,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.03610805018031024\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.0340105262010409,\n\
\ \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.0340105262010409\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.19828641370869032,\n\
\ \"mc1_stderr\": 0.013957608783385556,\n \"mc2\": 0.32137022625465045,\n\
\ \"mc2_stderr\": 0.013141258133613963\n }\n}\n```"
repo_url: https://huggingface.co/RWKV/rwkv-4-3b-pile
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|arc:challenge|25_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hellaswag|10_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T10:11:24.206687.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T10:11:24.206687.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T10:11:24.206687.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T10:11:24.206687.parquet'
- config_name: results
data_files:
- split: 2023_09_03T10_11_24.206687
path:
- results_2023-09-03T10:11:24.206687.parquet
- split: latest
path:
- results_2023-09-03T10:11:24.206687.parquet
---
# Dataset Card for Evaluation run of RWKV/rwkv-4-3b-pile
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/RWKV/rwkv-4-3b-pile
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [RWKV/rwkv-4-3b-pile](https://huggingface.co/RWKV/rwkv-4-3b-pile) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RWKV__rwkv-4-3b-pile",
"harness_truthfulqa_mc_0",
split="train")
```
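The split names above follow a simple convention: the run timestamp becomes a split name by replacing the `-` and `:` separators with `_`. A minimal sketch of that mapping (the helper name `timestamp_to_split` is illustrative, not part of the `datasets` API):

```python
def timestamp_to_split(ts: str) -> str:
    # Replace the "-" and ":" separators of the run timestamp with "_",
    # which yields the split name used in each configuration.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-03T10:11:24.206687"))
# 2023_09_03T10_11_24.206687
```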
## Latest results
These are the [latest results from run 2023-09-03T10:11:24.206687](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-4-3b-pile/blob/main/results_2023-09-03T10%3A11%3A24.206687.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2513377700414879,
"acc_stderr": 0.03144526214368921,
"acc_norm": 0.25456033465116495,
"acc_norm_stderr": 0.0314500028902646,
"mc1": 0.19828641370869032,
"mc1_stderr": 0.013957608783385556,
"mc2": 0.32137022625465045,
"mc2_stderr": 0.013141258133613963
},
"harness|arc:challenge|25": {
"acc": 0.3250853242320819,
"acc_stderr": 0.013688147309729117,
"acc_norm": 0.36006825938566556,
"acc_norm_stderr": 0.014027516814585183
},
"harness|hellaswag|10": {
"acc": 0.44144592710615416,
"acc_stderr": 0.00495544756469405,
"acc_norm": 0.5965943039235212,
"acc_norm_stderr": 0.0048957821077864885
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066652,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066652
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677084,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677084
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.02725726032249485,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.02725726032249485
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.035146974678623884,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.035146974678623884
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.029241883869628834,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.029241883869628834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131183,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131183
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533485,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533485
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2709677419354839,
"acc_stderr": 0.025284416114900156,
"acc_norm": 0.2709677419354839,
"acc_norm_stderr": 0.025284416114900156
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1919191919191919,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.1919191919191919,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21025641025641026,
"acc_stderr": 0.02066059748502692,
"acc_norm": 0.21025641025641026,
"acc_norm_stderr": 0.02066059748502692
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.025040443877000673,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.025040443877000673
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279472,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279472
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.032578473844367746,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.032578473844367746
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.018125669180861493,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.018125669180861493
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.02699145450203672,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.02699145450203672
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34977578475336324,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.34977578475336324,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.03989139859531773,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.03989139859531773
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.027601921381417593,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.027601921381417593
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26181353767560667,
"acc_stderr": 0.015720838678445252,
"acc_norm": 0.26181353767560667,
"acc_norm_stderr": 0.015720838678445252
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.022894082489925992,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.022894082489925992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.024954184324879905,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.024954184324879905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2347266881028939,
"acc_stderr": 0.024071805887677048,
"acc_norm": 0.2347266881028939,
"acc_norm_stderr": 0.024071805887677048
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2839506172839506,
"acc_stderr": 0.02508947852376513,
"acc_norm": 0.2839506172839506,
"acc_norm_stderr": 0.02508947852376513
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880592,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880592
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178475,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178475
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.017740899509177795,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.017740899509177795
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.21224489795918366,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.21224489795918366,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.02899690969332891,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.02899690969332891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.03610805018031024,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.03610805018031024
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.19828641370869032,
"mc1_stderr": 0.013957608783385556,
"mc2": 0.32137022625465045,
"mc2_stderr": 0.013141258133613963
}
}
```
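The results above are a flat JSON mapping of `harness|<task>|<n-shot>` keys to accuracy metrics. A minimal sketch of how such a results blob can be aggregated into a single MMLU-style mean score (the two-task JSON below is a hypothetical excerpt, not the full results):

```python
import json

# Hypothetical excerpt of a harness results JSON like the one above.
results_json = """
{
  "harness|hendrycksTest-college_computer_science|5": {"acc": 0.25, "acc_norm": 0.25},
  "harness|hendrycksTest-college_mathematics|5": {"acc": 0.24, "acc_norm": 0.24}
}
"""

results = json.loads(results_json)

# Average acc_norm across all MMLU (hendrycksTest) subtasks present.
mmlu_scores = [v["acc_norm"] for k, v in results.items() if "hendrycksTest" in k]
mean_acc_norm = sum(mmlu_scores) / len(mmlu_scores)
print(round(mean_acc_norm, 3))  # 0.245
```

The same pattern extends to the full results block: filter keys by benchmark prefix, then average the metric of interest.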
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]