datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
AlexFierro9/Kinetics400 | ---
license: cc-by-4.0
---
A large-scale, high-quality dataset of URL links to up to 650,000 video clips covering 400 human action classes.
The videos include human-object interactions such as playing instruments, as well as human-human interactions such as shaking hands and hugging.
Each action class has at least 400 video clips.
Each clip is human annotated with a single action class and lasts around 10 seconds.
Originally created by Google Inc.; the dataset has been uploaded as-is, without any changes. |
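The scale described above can be sanity-checked with simple arithmetic. This sketch uses only the numbers quoted in the card (400 classes, at least 400 clips per class, roughly 10 seconds per clip); it is a lower-bound estimate, not an exact count of the released dataset:

```python
# Lower-bound estimate of Kinetics400's size, from the card's own numbers.
NUM_CLASSES = 400          # stated number of human action classes
MIN_CLIPS_PER_CLASS = 400  # card says "at least 400" clips per class
CLIP_SECONDS = 10          # each clip "lasts around 10 seconds"

min_total_clips = NUM_CLASSES * MIN_CLIPS_PER_CLASS
min_total_hours = min_total_clips * CLIP_SECONDS / 3600

print(min_total_clips)              # 160000
print(round(min_total_hours, 1))    # 444.4
```

So even at the stated minimum, the dataset contains 160,000 clips (roughly 444 hours of video); the quoted ceiling of 650,000 clips is about four times that.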
hdeldar/Persian-Text-llama2-9k | ---
dataset_info:
features:
- name: answers
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 17137852
num_examples: 9008
- name: validation
num_bytes: 1753700
num_examples: 930
download_size: 1329768
dataset_size: 18891552
---
# Dataset Card for "Persian-Text-llama2-9k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
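The split statistics in the `dataset_info` block above are internally consistent and can be cross-checked locally. This sketch re-uses only the numbers quoted in the card (nothing is downloaded):

```python
# Cross-check the split statistics quoted in the card's dataset_info block.
splits = {
    "train": {"num_bytes": 17137852, "num_examples": 9008},
    "validation": {"num_bytes": 1753700, "num_examples": 930},
}

# dataset_size should equal the sum of the per-split byte counts.
dataset_size = sum(s["num_bytes"] for s in splits.values())

# Average serialized size per example, per split.
avg_bytes = {
    name: s["num_bytes"] / s["num_examples"] for name, s in splits.items()
}

print(dataset_size)                      # 18891552, matching the card
print(round(avg_bytes["train"]))         # ~1903 bytes per train example
print(round(avg_bytes["validation"]))    # ~1886 bytes per validation example
```

The computed total matches the card's `dataset_size: 18891552`, and both splits average roughly 1.9 KB per question/answer pair.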
open-llm-leaderboard/details_abhishek__hepu-o4zf-ravz-7-0 | ---
pretty_name: Evaluation run of abhishek/hepu-o4zf-ravz-7-0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abhishek/hepu-o4zf-ravz-7-0](https://huggingface.co/abhishek/hepu-o4zf-ravz-7-0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhishek__hepu-o4zf-ravz-7-0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-13T15:35:44.454119](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishek__hepu-o4zf-ravz-7-0/blob/main/results_2023-12-13T15-35-44.454119.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23314538989180242,\n\
\ \"acc_stderr\": 0.029997889374526535,\n \"acc_norm\": 0.23329981822026483,\n\
\ \"acc_norm_stderr\": 0.03078927812586186,\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156487,\n \"mc2\": 0.5166805857294422,\n\
\ \"mc2_stderr\": 0.016293087426390157\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20477815699658702,\n \"acc_stderr\": 0.011792544338513412,\n\
\ \"acc_norm\": 0.24488054607508533,\n \"acc_norm_stderr\": 0.012566273985131358\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2584146584345748,\n\
\ \"acc_stderr\": 0.004368684255626194,\n \"acc_norm\": 0.25363473411670984,\n\
\ \"acc_norm_stderr\": 0.004342017709967956\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n\
\ \"acc_stderr\": 0.03406542058502653,\n \"acc_norm\": 0.1925925925925926,\n\
\ \"acc_norm_stderr\": 0.03406542058502653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.025604233470899098,\n\
\ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.025604233470899098\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.028504856470514203,\n\
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.028504856470514203\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.0404933929774814,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.0404933929774814\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.21693121693121692,\n \"acc_stderr\": 0.021227082449445073,\n \"\
acc_norm\": 0.21693121693121692,\n \"acc_norm_stderr\": 0.021227082449445073\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.17096774193548386,\n \"acc_stderr\": 0.02141724293632157,\n \"\
acc_norm\": 0.17096774193548386,\n \"acc_norm_stderr\": 0.02141724293632157\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.1477832512315271,\n \"acc_stderr\": 0.024969621333521274,\n \"\
acc_norm\": 0.1477832512315271,\n \"acc_norm_stderr\": 0.024969621333521274\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20725388601036268,\n \"acc_stderr\": 0.02925282329180361,\n\
\ \"acc_norm\": 0.20725388601036268,\n \"acc_norm_stderr\": 0.02925282329180361\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2,\n \"acc_stderr\": 0.020280805062535722,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.020280805062535722\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.21851851851851853,\n \"acc_stderr\": 0.02519575225182379,\n\
\ \"acc_norm\": 0.21851851851851853,\n \"acc_norm_stderr\": 0.02519575225182379\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.18543046357615894,\n \"acc_stderr\": 0.03173284384294285,\n \"\
acc_norm\": 0.18543046357615894,\n \"acc_norm_stderr\": 0.03173284384294285\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.2,\n \"acc_stderr\": 0.017149858514250937,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.017149858514250937\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.14814814814814814,\n \"acc_stderr\": 0.02422762927372836,\n\
\ \"acc_norm\": 0.14814814814814814,\n \"acc_norm_stderr\": 0.02422762927372836\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n\
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n\
\ \"acc_stderr\": 0.029872577708891148,\n \"acc_norm\": 0.2948717948717949,\n\
\ \"acc_norm_stderr\": 0.029872577708891148\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23243933588761176,\n\
\ \"acc_stderr\": 0.015104550008905702,\n \"acc_norm\": 0.23243933588761176,\n\
\ \"acc_norm_stderr\": 0.015104550008905702\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.02405102973991225,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.02405102973991225\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0227797190887334,\n\
\ \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0227797190887334\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2426470588235294,\n \"acc_stderr\": 0.026040662474201247,\n\
\ \"acc_norm\": 0.2426470588235294,\n \"acc_norm_stderr\": 0.026040662474201247\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.03460579907553027,\n\
\ \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.03460579907553027\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.26900584795321636,\n\
\ \"acc_stderr\": 0.0340105262010409,\n \"acc_norm\": 0.26900584795321636,\n\
\ \"acc_norm_stderr\": 0.0340105262010409\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.25458996328029376,\n \"mc1_stderr\": 0.015250117079156487,\n\
\ \"mc2\": 0.5166805857294422,\n \"mc2_stderr\": 0.016293087426390157\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.4925019731649566,\n\
\ \"acc_stderr\": 0.014050905521228573\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/abhishek/hepu-o4zf-ravz-7-0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|arc:challenge|25_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|gsm8k|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hellaswag|10_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T15-35-44.454119.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T15-35-44.454119.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- '**/details_harness|winogrande|5_2023-12-13T15-35-44.454119.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-13T15-35-44.454119.parquet'
- config_name: results
data_files:
- split: 2023_12_13T15_35_44.454119
path:
- results_2023-12-13T15-35-44.454119.parquet
- split: latest
path:
- results_2023-12-13T15-35-44.454119.parquet
---
# Dataset Card for Evaluation run of abhishek/hepu-o4zf-ravz-7-0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhishek/hepu-o4zf-ravz-7-0](https://huggingface.co/abhishek/hepu-o4zf-ravz-7-0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhishek__hepu-o4zf-ravz-7-0",
"harness_winogrande_5",
split="train")
```
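Every per-task config listed in the YAML above follows the same naming pattern, so config names can be built programmatically. A minimal sketch (the `mmlu_config` helper is illustrative, not part of the dataset itself):

```python
def mmlu_config(subject: str, n_shots: int = 5) -> str:
    """Build the config name for a hendrycksTest (MMLU) subject."""
    return f"harness_hendrycksTest_{subject}_{n_shots}"


if __name__ == "__main__":
    # Load the latest college_medicine details (requires the `datasets` library).
    from datasets import load_dataset

    data = load_dataset(
        "open-llm-leaderboard/details_abhishek__hepu-o4zf-ravz-7-0",
        mmlu_config("college_medicine"),
        split="latest",
    )
```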
## Latest results
These are the [latest results from run 2023-12-13T15:35:44.454119](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishek__hepu-o4zf-ravz-7-0/blob/main/results_2023-12-13T15-35-44.454119.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23314538989180242,
"acc_stderr": 0.029997889374526535,
"acc_norm": 0.23329981822026483,
"acc_norm_stderr": 0.03078927812586186,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156487,
"mc2": 0.5166805857294422,
"mc2_stderr": 0.016293087426390157
},
"harness|arc:challenge|25": {
"acc": 0.20477815699658702,
"acc_stderr": 0.011792544338513412,
"acc_norm": 0.24488054607508533,
"acc_norm_stderr": 0.012566273985131358
},
"harness|hellaswag|10": {
"acc": 0.2584146584345748,
"acc_stderr": 0.004368684255626194,
"acc_norm": 0.25363473411670984,
"acc_norm_stderr": 0.004342017709967956
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1925925925925926,
"acc_stderr": 0.03406542058502653,
"acc_norm": 0.1925925925925926,
"acc_norm_stderr": 0.03406542058502653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.025604233470899098,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.025604233470899098
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.028504856470514203,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.028504856470514203
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.0404933929774814,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.0404933929774814
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21693121693121692,
"acc_stderr": 0.021227082449445073,
"acc_norm": 0.21693121693121692,
"acc_norm_stderr": 0.021227082449445073
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.17096774193548386,
"acc_stderr": 0.02141724293632157,
"acc_norm": 0.17096774193548386,
"acc_norm_stderr": 0.02141724293632157
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1477832512315271,
"acc_stderr": 0.024969621333521274,
"acc_norm": 0.1477832512315271,
"acc_norm_stderr": 0.024969621333521274
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20725388601036268,
"acc_stderr": 0.02925282329180361,
"acc_norm": 0.20725388601036268,
"acc_norm_stderr": 0.02925282329180361
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2,
"acc_stderr": 0.020280805062535722,
"acc_norm": 0.2,
"acc_norm_stderr": 0.020280805062535722
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21851851851851853,
"acc_stderr": 0.02519575225182379,
"acc_norm": 0.21851851851851853,
"acc_norm_stderr": 0.02519575225182379
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.18543046357615894,
"acc_stderr": 0.03173284384294285,
"acc_norm": 0.18543046357615894,
"acc_norm_stderr": 0.03173284384294285
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2,
"acc_stderr": 0.017149858514250937,
"acc_norm": 0.2,
"acc_norm_stderr": 0.017149858514250937
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.14814814814814814,
"acc_stderr": 0.02422762927372836,
"acc_norm": 0.14814814814814814,
"acc_norm_stderr": 0.02422762927372836
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891148,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891148
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23243933588761176,
"acc_stderr": 0.015104550008905702,
"acc_norm": 0.23243933588761176,
"acc_norm_stderr": 0.015104550008905702
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.02405102973991225,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.02405102973991225
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0227797190887334,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0227797190887334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2426470588235294,
"acc_stderr": 0.026040662474201247,
"acc_norm": 0.2426470588235294,
"acc_norm_stderr": 0.026040662474201247
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156487,
"mc2": 0.5166805857294422,
"mc2_stderr": 0.016293087426390157
},
"harness|winogrande|5": {
"acc": 0.4925019731649566,
"acc_stderr": 0.014050905521228573
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
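The per-subject `hendrycksTest-*` accuracies above can be collapsed into a single MMLU-style score by taking an unweighted mean over subjects. A small sketch using three of the reported `acc` values (the full computation would iterate over all 57 subjects):

```python
# acc values copied from the results block above (subset for illustration)
subject_acc = {
    "abstract_algebra": 0.22,
    "anatomy": 0.1925925925925926,
    "astronomy": 0.18421052631578946,
}


def macro_average(scores: dict) -> float:
    """Unweighted mean over subjects (a common way to aggregate MMLU)."""
    return sum(scores.values()) / len(scores)


mmlu_avg = macro_average(subject_acc)
print(f"MMLU average over {len(subject_acc)} subjects: {mmlu_avg:.4f}")
```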
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
CyberHarem/marseillaise_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of marseillaise/マルセイエーズ/马赛曲 (Azur Lane)
This is the dataset of marseillaise/マルセイエーズ/马赛曲 (Azur Lane), containing 23 images and their tags.
The core tags of this character are `breasts, long_hair, red_eyes, large_breasts, bangs, white_hair, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 52.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marseillaise_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 23 | 21.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marseillaise_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 63 | 49.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marseillaise_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 23 | 43.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marseillaise_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 63 | 83.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marseillaise_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/marseillaise_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
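The IMG+TXT packages listed above pair every image with a same-named `.txt` file of comma-separated tags. As a minimal sketch (assuming that flat image-plus-sidecar layout after extraction, which is not guaranteed by the card), the pairs can be collected without waifuc:

```python
import tempfile
from pathlib import Path

def load_img_txt_pairs(dataset_dir):
    """Pair each image in an extracted IMG+TXT package with the
    comma-separated tags from its same-named .txt sidecar file."""
    pairs = {}
    root = Path(dataset_dir)
    for img in sorted(root.glob("*.png")) + sorted(root.glob("*.jpg")):
        txt = img.with_suffix(".txt")
        tags = [t.strip() for t in txt.read_text().split(",")] if txt.exists() else []
        pairs[img.stem] = (img, tags)
    return pairs

# Tiny demo with stand-in files; a real package contains actual image data.
with tempfile.TemporaryDirectory() as d:
    (Path(d) / "0001.png").write_bytes(b"")
    (Path(d) / "0001.txt").write_text("1girl, solo, long_hair")
    demo = load_img_txt_pairs(d)
    print(demo["0001"][1])  # ['1girl', 'solo', 'long_hair']
```

Point `dataset_dir` at the directory where you extracted e.g. `dataset-800.zip`.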
## List of Clusters
List of tag clustering results; some recurring outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, solo, blush, cleavage, detached_sleeves, looking_at_viewer, white_dress, black_thighhighs, navel, black_gloves, closed_mouth, hair_ornament, thighs, horns, smile, cowboy_shot, long_sleeves, panties, simple_background, white_background |
| 1 | 6 |  |  |  |  |  | 1girl, black_pants, looking_at_viewer, solo, sports_bra, yoga_pants, ass, bare_shoulders, blush, no_shoes, sweat, sitting, closed_mouth, grey_hair, looking_back, white_socks |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | cleavage | detached_sleeves | looking_at_viewer | white_dress | black_thighhighs | navel | black_gloves | closed_mouth | hair_ornament | thighs | horns | smile | cowboy_shot | long_sleeves | panties | simple_background | white_background | black_pants | sports_bra | yoga_pants | ass | bare_shoulders | no_shoes | sweat | sitting | grey_hair | looking_back | white_socks |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-----------|:-------------------|:--------------------|:--------------|:-------------------|:--------|:---------------|:---------------|:----------------|:---------|:--------|:--------|:--------------|:---------------|:----------|:--------------------|:-------------------|:--------------|:-------------|:-------------|:------|:-----------------|:-----------|:--------|:----------|:------------|:---------------|:--------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | | | X | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
kaleemWaheed/twitter_dataset_1713134790 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 23978
num_examples: 57
download_size: 13683
dataset_size: 23978
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Syed-Hasan-8503__openhermes-gemma-2b-it | ---
pretty_name: Evaluation run of Syed-Hasan-8503/openhermes-gemma-2b-it
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Syed-Hasan-8503/openhermes-gemma-2b-it](https://huggingface.co/Syed-Hasan-8503/openhermes-gemma-2b-it)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Syed-Hasan-8503__openhermes-gemma-2b-it\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-22T14:39:27.607922](https://huggingface.co/datasets/open-llm-leaderboard/details_Syed-Hasan-8503__openhermes-gemma-2b-it/blob/main/results_2024-02-22T14-39-27.607922.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.37696524742191334,\n\
\ \"acc_stderr\": 0.03381316358729798,\n \"acc_norm\": 0.3815378335823341,\n\
\ \"acc_norm_stderr\": 0.03461953317836164,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.458323806326475,\n\
\ \"mc2_stderr\": 0.015931044127458407\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4044368600682594,\n \"acc_stderr\": 0.014342036483436172,\n\
\ \"acc_norm\": 0.439419795221843,\n \"acc_norm_stderr\": 0.01450374782358013\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4810794662417845,\n\
\ \"acc_stderr\": 0.00498620758186293,\n \"acc_norm\": 0.627365066719777,\n\
\ \"acc_norm_stderr\": 0.004825179407757572\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.3851851851851852,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.03842498559395269,\n\
\ \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.03842498559395269\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.42641509433962266,\n \"acc_stderr\": 0.030437794342983042,\n\
\ \"acc_norm\": 0.42641509433962266,\n \"acc_norm_stderr\": 0.030437794342983042\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3402777777777778,\n\
\ \"acc_stderr\": 0.03962135573486219,\n \"acc_norm\": 0.3402777777777778,\n\
\ \"acc_norm_stderr\": 0.03962135573486219\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.3583815028901734,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.35319148936170214,\n \"acc_stderr\": 0.031245325202761923,\n\
\ \"acc_norm\": 0.35319148936170214,\n \"acc_norm_stderr\": 0.031245325202761923\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707546,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707546\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400168,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400168\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n\
\ \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.46060606060606063,\n \"acc_stderr\": 0.03892207016552012,\n\
\ \"acc_norm\": 0.46060606060606063,\n \"acc_norm_stderr\": 0.03892207016552012\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4595959595959596,\n \"acc_stderr\": 0.035507024651313425,\n \"\
acc_norm\": 0.4595959595959596,\n \"acc_norm_stderr\": 0.035507024651313425\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.47668393782383417,\n \"acc_stderr\": 0.03604513672442207,\n\
\ \"acc_norm\": 0.47668393782383417,\n \"acc_norm_stderr\": 0.03604513672442207\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3282051282051282,\n \"acc_stderr\": 0.023807633198657262,\n\
\ \"acc_norm\": 0.3282051282051282,\n \"acc_norm_stderr\": 0.023807633198657262\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2,\n \"acc_stderr\": 0.024388430433987664,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.024388430433987664\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.3403361344537815,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.3403361344537815,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5100917431192661,\n \"acc_stderr\": 0.021432956203453316,\n \"\
acc_norm\": 0.5100917431192661,\n \"acc_norm_stderr\": 0.021432956203453316\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2037037037037037,\n \"acc_stderr\": 0.027467401804057986,\n \"\
acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.027467401804057986\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4215686274509804,\n \"acc_stderr\": 0.03465868196380758,\n \"\
acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.03465868196380758\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5189873417721519,\n \"acc_stderr\": 0.03252375148090448,\n \
\ \"acc_norm\": 0.5189873417721519,\n \"acc_norm_stderr\": 0.03252375148090448\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.3901345291479821,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.42748091603053434,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.42748091603053434,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5206611570247934,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.5206611570247934,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.46296296296296297,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.46296296296296297,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3619631901840491,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.3619631901840491,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.044642857142857116,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.044642857142857116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.44660194174757284,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.44660194174757284,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.594017094017094,\n\
\ \"acc_stderr\": 0.03217180182641086,\n \"acc_norm\": 0.594017094017094,\n\
\ \"acc_norm_stderr\": 0.03217180182641086\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.46360153256704983,\n\
\ \"acc_stderr\": 0.01783252407959326,\n \"acc_norm\": 0.46360153256704983,\n\
\ \"acc_norm_stderr\": 0.01783252407959326\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.407514450867052,\n \"acc_stderr\": 0.0264545781469315,\n\
\ \"acc_norm\": 0.407514450867052,\n \"acc_norm_stderr\": 0.0264545781469315\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.01455155365936992,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.01455155365936992\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.02849199358617157,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.02849199358617157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.40192926045016075,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.40192926045016075,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.027339546640662727,\n\
\ \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.027339546640662727\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3049645390070922,\n \"acc_stderr\": 0.027464708442022135,\n \
\ \"acc_norm\": 0.3049645390070922,\n \"acc_norm_stderr\": 0.027464708442022135\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.31877444589308995,\n\
\ \"acc_stderr\": 0.0119018956357861,\n \"acc_norm\": 0.31877444589308995,\n\
\ \"acc_norm_stderr\": 0.0119018956357861\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.024398192986654924,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.024398192986654924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.380718954248366,\n \"acc_stderr\": 0.019643801557924806,\n \
\ \"acc_norm\": 0.380718954248366,\n \"acc_norm_stderr\": 0.019643801557924806\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.41818181818181815,\n\
\ \"acc_stderr\": 0.0472457740573157,\n \"acc_norm\": 0.41818181818181815,\n\
\ \"acc_norm_stderr\": 0.0472457740573157\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.47346938775510206,\n \"acc_stderr\": 0.03196412734523272,\n\
\ \"acc_norm\": 0.47346938775510206,\n \"acc_norm_stderr\": 0.03196412734523272\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.43283582089552236,\n\
\ \"acc_stderr\": 0.03503490923673281,\n \"acc_norm\": 0.43283582089552236,\n\
\ \"acc_norm_stderr\": 0.03503490923673281\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03811079669833531,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03811079669833531\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.458323806326475,\n\
\ \"mc2_stderr\": 0.015931044127458407\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6093133385951065,\n \"acc_stderr\": 0.01371253603655665\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.056103108415466264,\n \
\ \"acc_stderr\": 0.006338668431321893\n }\n}\n```"
repo_url: https://huggingface.co/Syed-Hasan-8503/openhermes-gemma-2b-it
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|arc:challenge|25_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|gsm8k|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hellaswag|10_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T14-39-27.607922.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T14-39-27.607922.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- '**/details_harness|winogrande|5_2024-02-22T14-39-27.607922.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-22T14-39-27.607922.parquet'
- config_name: results
data_files:
- split: 2024_02_22T14_39_27.607922
path:
- results_2024-02-22T14-39-27.607922.parquet
- split: latest
path:
- results_2024-02-22T14-39-27.607922.parquet
---
# Dataset Card for Evaluation run of abhishek/hepu-o4zf-ravz-7-0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhishek/hepu-o4zf-ravz-7-0](https://huggingface.co/abhishek/hepu-o4zf-ravz-7-0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhishek__hepu-o4zf-ravz-7-0",
"harness_winogrande_5",
split="train")
```
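Each run's per-sample details live in a split named after the run timestamp, with the dashes in the date and time replaced by underscores (e.g. `2024-02-22T14-39-27.607922` becomes `2024_02_22T14_39_27.607922`). A small helper for building that split name — the function name is illustrative, not part of any library:

```python
def run_timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp such as '2024-02-22T14-39-27.607922'
    into the split name used in this dataset
    ('2024_02_22T14_39_27.607922')."""
    return ts.replace("-", "_")


# Requires network access; repo and config names are taken from this card.
if __name__ == "__main__":
    from datasets import load_dataset

    split = run_timestamp_to_split("2024-02-22T14-39-27.607922")
    data = load_dataset(
        "open-llm-leaderboard/details_abhishek__hepu-o4zf-ravz-7-0",
        "harness_winogrande_5",
        split=split,
    )
    print(data)
```

With only one run in the repo, this split and the `latest` split point at the same parquet file.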
## Latest results
These are the [latest results from run 2024-02-22T14:39:27.607922](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishek__hepu-o4zf-ravz-7-0/blob/main/results_2024-02-22T14-39-27.607922.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.37696524742191334,
"acc_stderr": 0.03381316358729798,
"acc_norm": 0.3815378335823341,
"acc_norm_stderr": 0.03461953317836164,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.458323806326475,
"mc2_stderr": 0.015931044127458407
},
"harness|arc:challenge|25": {
"acc": 0.4044368600682594,
"acc_stderr": 0.014342036483436172,
"acc_norm": 0.439419795221843,
"acc_norm_stderr": 0.01450374782358013
},
"harness|hellaswag|10": {
"acc": 0.4810794662417845,
"acc_stderr": 0.00498620758186293,
"acc_norm": 0.627365066719777,
"acc_norm_stderr": 0.004825179407757572
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3355263157894737,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.3355263157894737,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.42641509433962266,
"acc_stderr": 0.030437794342983042,
"acc_norm": 0.42641509433962266,
"acc_norm_stderr": 0.030437794342983042
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3402777777777778,
"acc_stderr": 0.03962135573486219,
"acc_norm": 0.3402777777777778,
"acc_norm_stderr": 0.03962135573486219
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.35319148936170214,
"acc_stderr": 0.031245325202761923,
"acc_norm": 0.35319148936170214,
"acc_norm_stderr": 0.031245325202761923
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707546,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707546
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400168,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.0319474007226554,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.0319474007226554
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.46060606060606063,
"acc_stderr": 0.03892207016552012,
"acc_norm": 0.46060606060606063,
"acc_norm_stderr": 0.03892207016552012
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4595959595959596,
"acc_stderr": 0.035507024651313425,
"acc_norm": 0.4595959595959596,
"acc_norm_stderr": 0.035507024651313425
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47668393782383417,
"acc_stderr": 0.03604513672442207,
"acc_norm": 0.47668393782383417,
"acc_norm_stderr": 0.03604513672442207
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3282051282051282,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.3282051282051282,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2,
"acc_stderr": 0.024388430433987664,
"acc_norm": 0.2,
"acc_norm_stderr": 0.024388430433987664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3403361344537815,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.3403361344537815,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5100917431192661,
"acc_stderr": 0.021432956203453316,
"acc_norm": 0.5100917431192661,
"acc_norm_stderr": 0.021432956203453316
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.027467401804057986,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.027467401804057986
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.03465868196380758,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.03465868196380758
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5189873417721519,
"acc_stderr": 0.03252375148090448,
"acc_norm": 0.5189873417721519,
"acc_norm_stderr": 0.03252375148090448
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3901345291479821,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.3901345291479821,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.42748091603053434,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.42748091603053434,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5206611570247934,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.5206611570247934,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3619631901840491,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.3619631901840491,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.044642857142857116,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.044642857142857116
},
"harness|hendrycksTest-management|5": {
"acc": 0.44660194174757284,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.44660194174757284,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.594017094017094,
"acc_stderr": 0.03217180182641086,
"acc_norm": 0.594017094017094,
"acc_norm_stderr": 0.03217180182641086
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.46360153256704983,
"acc_stderr": 0.01783252407959326,
"acc_norm": 0.46360153256704983,
"acc_norm_stderr": 0.01783252407959326
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.407514450867052,
"acc_stderr": 0.0264545781469315,
"acc_norm": 0.407514450867052,
"acc_norm_stderr": 0.0264545781469315
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.01455155365936992,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.01455155365936992
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.02849199358617157,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.02849199358617157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.40192926045016075,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.40192926045016075,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.027339546640662727,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.027339546640662727
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3049645390070922,
"acc_stderr": 0.027464708442022135,
"acc_norm": 0.3049645390070922,
"acc_norm_stderr": 0.027464708442022135
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.31877444589308995,
"acc_stderr": 0.0119018956357861,
"acc_norm": 0.31877444589308995,
"acc_norm_stderr": 0.0119018956357861
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.380718954248366,
"acc_stderr": 0.019643801557924806,
"acc_norm": 0.380718954248366,
"acc_norm_stderr": 0.019643801557924806
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.0472457740573157,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.0472457740573157
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.47346938775510206,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.47346938775510206,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.43283582089552236,
"acc_stderr": 0.03503490923673281,
"acc_norm": 0.43283582089552236,
"acc_norm_stderr": 0.03503490923673281
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03811079669833531,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03811079669833531
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.458323806326475,
"mc2_stderr": 0.015931044127458407
},
"harness|winogrande|5": {
"acc": 0.6093133385951065,
"acc_stderr": 0.01371253603655665
},
"harness|gsm8k|5": {
"acc": 0.056103108415466264,
"acc_stderr": 0.006338668431321893
}
}
```
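The top-level `"all"` block appears to be the unweighted (macro) mean of the per-task metrics — an assumption about the harness aggregation, not something stated in this card. A minimal sketch using three per-task `acc` values copied from the JSON above:

```python
# Illustrative per-task accuracies, copied from the results JSON above.
per_task_acc = {
    "hendrycksTest-abstract_algebra": 0.28,
    "hendrycksTest-anatomy": 0.3851851851851852,
    "hendrycksTest-astronomy": 0.3355263157894737,
}

# Unweighted (macro) average over this subset of tasks.
macro_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(round(macro_acc, 4))  # → 0.3336
```

Averaging all 57 MMLU subjects (plus ARC and HellaSwag) the same way would recover numbers close to the aggregate `"all"` accuracy, assuming equal task weighting.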
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
karpathy/tiny_shakespeare | ---
paperswithcode_id: null
pretty_name: TinyShakespeare
dataset_info:
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 55780
num_examples: 1
- name: train
num_bytes: 1003864
num_examples: 1
- name: validation
num_bytes: 55780
num_examples: 1
download_size: 1115394
dataset_size: 1115424
---
# Dataset Card for "tiny_shakespeare"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/karpathy/char-rnn/blob/master/data/tinyshakespeare/input.txt](https://github.com/karpathy/char-rnn/blob/master/data/tinyshakespeare/input.txt)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 1.11 MB
- **Size of the generated dataset:** 1.11 MB
- **Total amount of disk used:** 2.23 MB
### Dataset Summary
40,000 lines of Shakespeare from a variety of Shakespeare's plays. Featured in Andrej Karpathy's blog post 'The Unreasonable Effectiveness of Recurrent Neural Networks': http://karpathy.github.io/2015/05/21/rnn-effectiveness/.
To use for e.g. character modelling:
```python
from datasets import load_dataset

# recent versions of `datasets` may require trust_remote_code=True here
d = load_dataset("tiny_shakespeare")["train"]
text = d[0]["text"]  # each split holds a single example containing the full text

# train split includes vocabulary for other splits
vocabulary = sorted(set(text))

# shift by one character to form (cur_char, next_char) prediction pairs
cur_chars, next_chars = text[:-1], text[1:]

# chop into fixed-length sequences, then group sequences into batches
seq_len = 100
batch_size = 2
sequences = [cur_chars[i:i + seq_len] for i in range(0, len(cur_chars), seq_len)]
batches = [sequences[i:i + batch_size] for i in range(0, len(sequences), batch_size)]
```
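Building on the vocabulary above, a common next step (not part of the original snippet) is mapping characters to integer ids. A minimal round-trip sketch using the example passage from this card:

```python
# Minimal sketch: character-to-id encoding for a character-level model.
text = "First Citizen:\nBefore we proceed any further, hear me"
vocabulary = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(vocabulary)}  # char -> id
itos = {i: ch for ch, i in stoi.items()}           # id -> char

encoded = [stoi[ch] for ch in text]
decoded = "".join(itos[i] for i in encoded)
assert decoded == text  # the mapping is lossless
```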
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 1.11 MB
- **Size of the generated dataset:** 1.11 MB
- **Total amount of disk used:** 2.23 MB
An example of 'train' looks as follows.
```
{
"text": "First Citizen:\nBefore we proceed any further, hear me "
}
```
### Data Fields
The data fields are the same among all splits.
#### default
- `text`: a `string` feature.
### Data Splits
| name |train|validation|test|
|-------|----:|---------:|---:|
|default| 1| 1| 1|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@misc{karpathy2015charrnn,
  author={Karpathy, Andrej},
  title={char-rnn},
  year={2015},
  howpublished={\url{https://github.com/karpathy/char-rnn}}
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@lewtun](https://github.com/lewtun), [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset. |
jmartin233/reading_comprehension_exercise_dataset_v2 | ---
dataset_info:
features:
- name: person
dtype: string
- name: location
dtype: string
- name: grammar
dtype: string
- name: level
dtype: string
- name: passage
dtype: string
splits:
- name: train
num_bytes: 104862
num_examples: 171
download_size: 53842
dataset_size: 104862
---
# Dataset Card for "reading_comprehension_exercise_dataset_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jbilcke-hf/ai-tube-tensar-studios | ---
license: cc-by-nc-sa-4.0
pretty_name: Tensar Studios
---
## Description
Follow us to experience a new era of 3D AI-animated short films.
## Model
SVD
## LoRA
artificialguybr/3DRedmond-V1
## Style
- 3D Render Style
- 3DRenderAF
## Voice
Cloée
## Music
Background cinematic music
## Prompt
A video channel which produces short 3D films of fictional franchises.
Stories should be short, about 1 or 2 minutes, but full of action and fun.
It will NEVER create content from existing artists or studios.
Instead it will create its own artistic content, stories and characters.
Characters should not be biased towards any specific gender, work, country, religion or culture.
Some themes should be avoided in the stories, such as violence, sex, war, crime, etc.
No nudity is tolerated.
It should use animal characters whenever possible, but also human characters.
Characters should be very cartoony / cute.
|
taesiri/imagenet-hard-4K | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: label
sequence: int64
- name: origin
dtype: string
- name: english_label
sequence: string
splits:
- name: validation
num_bytes: 70959420455.86
num_examples: 10980
download_size: 66129324319
dataset_size: 70959420455.86
license: mit
task_categories:
- image-classification
language:
- en
tags:
- OOD
- ImageNet
- Out Of Distribution
pretty_name: ImageNet-Hard-4K
size_categories:
- 10K<n<100K
---
# Dataset Card for "Imagenet-Hard-4K"
[Project Page](https://taesiri.github.io/ZoomIsAllYouNeed/) - [Paper](https://arxiv.org/abs/2304.05538) - [Github](https://github.com/taesiri/ZoomIsAllYouNeed)
**ImageNet-Hard-4K** is a 4K version of the original [**ImageNet-Hard**](https://huggingface.co/datasets/taesiri/imagenet-hard) dataset, a benchmark comprising 10,980 images collected from various existing ImageNet-scale benchmarks (ImageNet, ImageNet-V2, ImageNet-Sketch, ImageNet-C, ImageNet-R, ImageNet-ReaL, ImageNet-A, and ObjectNet). This dataset poses a significant challenge to state-of-the-art vision models because merely zooming in often fails to improve their ability to classify images correctly. As a result, even the most advanced models, such as `CLIP-ViT-L/14@336px`, struggle to perform well on this dataset, achieving a mere `2.02%` accuracy.
## Upscaling Procedure
We employed [GigaGAN](https://mingukkang.github.io/GigaGAN/) to upscale each image from the original ImageNet-Hard dataset to a resolution of 4K.
### Dataset Distribution

### Classifiers Performance
| Model | Accuracy |
| ------------------- | -------- |
| AlexNet | 7.08 |
| VGG-16 | 11.32 |
| ResNet-18 | 10.42 |
| ResNet-50 | 13.93 |
| ViT-B/32 | 18.12 |
| EfficientNet-B0 | 12.94 |
| EfficientNet-B7 | 18.67 |
| EfficientNet-L2-Ns | 28.42 |
| CLIP-ViT-L/14@224px | 1.81 |
| CLIP-ViT-L/14@336px | 1.88 |
| OpenCLIP-ViT-bigG-14| 14.33 |
| OpenCLIP-ViT-L-14 | 13.04 |
**Evaluation Code**
* CLIP <a target="_blank" href="https://colab.research.google.com/github/taesiri/ZoomIsAllYouNeed/blob/main/src/ImageNet_Hard/Prompt_Engineering_for_ImageNet_Hard.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/> </a>
* Other models <a target="_blank" href="https://colab.research.google.com/github/taesiri/ZoomIsAllYouNeed/blob/main/src/ImageNet_Hard/Benchmark_ImageNet_Hard.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/> </a>
## Supported Tasks
- `image-classification`: The objective of this task is to classify an image into one or more classes, selected from 1000 ImageNet categories (allowing for multiple ground-truth labels per image).
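Because `label` is a list, a prediction should be scored against all ground-truth ids. A minimal sketch of that accuracy rule (the helper name and the sample predictions are illustrative, not taken from the benchmark's evaluation code; the id/label pairs come from the class table below):

```python
# Illustrative top-1 accuracy under multiple ground-truth labels per image.
def is_correct(pred, labels):
    # a prediction counts as correct if it matches any ground-truth id
    return pred in labels

examples = [
    {"label": [0], "english_label": ["tench"]},
    {"label": [21, 22], "english_label": ["kite", "bald eagle"]},
]
preds = [0, 22]
accuracy = sum(is_correct(p, ex["label"]) for p, ex in zip(preds, examples)) / len(examples)
print(accuracy)  # both predictions hit a ground-truth id -> 1.0
```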
## Languages
The `english_label` field in the dataset is in English.
## Dataset Structure
### Data Instances
An example looks like this:
```python
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=575x409 at 0x7F09456B53A0>,
'label': [0],
'origin': 'imagenet_sketch',
'english_label': ['tench']
}
```
### Data Fields
The data instances have the following fields:
- `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column (`dataset[0]["image"]`), the image file is automatically decoded. Decoding a large number of image files can take a significant amount of time, so always query the sample index before the `"image"` column, i.e. prefer `dataset[0]["image"]` over `dataset["image"][0]`.
- `label`: A `List[int]` collection containing the ground-truth ids.
- `origin`: A `string` containing the source dataset.
- `english_label`: A `List[str]` collection containing the English labels for the ground-truth classes.
<details>
<summary>
Click here to see the full list of ImageNet class labels mapping:
</summary>
|id|Class|
|--|-----|
|0 | tench, Tinca tinca|
|1 | goldfish, Carassius auratus|
|2 | great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias|
|3 | tiger shark, Galeocerdo cuvieri|
|4 | hammerhead, hammerhead shark|
|5 | electric ray, crampfish, numbfish, torpedo|
|6 | stingray|
|7 | cock|
|8 | hen|
|9 | ostrich, Struthio camelus|
|10 | brambling, Fringilla montifringilla|
|11 | goldfinch, Carduelis carduelis|
|12 | house finch, linnet, Carpodacus mexicanus|
|13 | junco, snowbird|
|14 | indigo bunting, indigo finch, indigo bird, Passerina cyanea|
|15 | robin, American robin, Turdus migratorius|
|16 | bulbul|
|17 | jay|
|18 | magpie|
|19 | chickadee|
|20 | water ouzel, dipper|
|21 | kite|
|22 | bald eagle, American eagle, Haliaeetus leucocephalus|
|23 | vulture|
|24 | great grey owl, great gray owl, Strix nebulosa|
|25 | European fire salamander, Salamandra salamandra|
|26 | common newt, Triturus vulgaris|
|27 | eft|
|28 | spotted salamander, Ambystoma maculatum|
|29 | axolotl, mud puppy, Ambystoma mexicanum|
|30 | bullfrog, Rana catesbeiana|
|31 | tree frog, tree-frog|
|32 | tailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui|
|33 | loggerhead, loggerhead turtle, Caretta caretta|
|34 | leatherback turtle, leatherback, leathery turtle, Dermochelys coriacea|
|35 | mud turtle|
|36 | terrapin|
|37 | box turtle, box tortoise|
|38 | banded gecko|
|39 | common iguana, iguana, Iguana iguana|
|40 | American chameleon, anole, Anolis carolinensis|
|41 | whiptail, whiptail lizard|
|42 | agama|
|43 | frilled lizard, Chlamydosaurus kingi|
|44 | alligator lizard|
|45 | Gila monster, Heloderma suspectum|
|46 | green lizard, Lacerta viridis|
|47 | African chameleon, Chamaeleo chamaeleon|
|48 | Komodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis|
|49 | African crocodile, Nile crocodile, Crocodylus niloticus|
|50 | American alligator, Alligator mississipiensis|
|51 | triceratops|
|52 | thunder snake, worm snake, Carphophis amoenus|
|53 | ringneck snake, ring-necked snake, ring snake|
|54 | hognose snake, puff adder, sand viper|
|55 | green snake, grass snake|
|56 | king snake, kingsnake|
|57 | garter snake, grass snake|
|58 | water snake|
|59 | vine snake|
|60 | night snake, Hypsiglena torquata|
|61 | boa constrictor, Constrictor constrictor|
|62 | rock python, rock snake, Python sebae|
|63 | Indian cobra, Naja naja|
|64 | green mamba|
|65 | sea snake|
|66 | horned viper, cerastes, sand viper, horned asp, Cerastes cornutus|
|67 | diamondback, diamondback rattlesnake, Crotalus adamanteus|
|68 | sidewinder, horned rattlesnake, Crotalus cerastes|
|69 | trilobite|
|70 | harvestman, daddy longlegs, Phalangium opilio|
|71 | scorpion|
|72 | black and gold garden spider, Argiope aurantia|
|73 | barn spider, Araneus cavaticus|
|74 | garden spider, Aranea diademata|
|75 | black widow, Latrodectus mactans|
|76 | tarantula|
|77 | wolf spider, hunting spider|
|78 | tick|
|79 | centipede|
|80 | black grouse|
|81 | ptarmigan|
|82 | ruffed grouse, partridge, Bonasa umbellus|
|83 | prairie chicken, prairie grouse, prairie fowl|
|84 | peacock|
|85 | quail|
|86 | partridge|
|87 | African grey, African gray, Psittacus erithacus|
|88 | macaw|
|89 | sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita|
|90 | lorikeet|
|91 | coucal|
|92 | bee eater|
|93 | hornbill|
|94 | hummingbird|
|95 | jacamar|
|96 | toucan|
|97 | drake|
|98 | red-breasted merganser, Mergus serrator|
|99 | goose|
|100 | black swan, Cygnus atratus|
|101 | tusker|
|102 | echidna, spiny anteater, anteater|
|103 | platypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus|
|104 | wallaby, brush kangaroo|
|105 | koala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus|
|106 | wombat|
|107 | jellyfish|
|108 | sea anemone, anemone|
|109 | brain coral|
|110 | flatworm, platyhelminth|
|111 | nematode, nematode worm, roundworm|
|112 | conch|
|113 | snail|
|114 | slug|
|115 | sea slug, nudibranch|
|116 | chiton, coat-of-mail shell, sea cradle, polyplacophore|
|117 | chambered nautilus, pearly nautilus, nautilus|
|118 | Dungeness crab, Cancer magister|
|119 | rock crab, Cancer irroratus|
|120 | fiddler crab|
|121 | king crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica|
|122 | American lobster, Northern lobster, Maine lobster, Homarus americanus|
|123 | spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish|
|124 | crayfish, crawfish, crawdad, crawdaddy|
|125 | hermit crab|
|126 | isopod|
|127 | white stork, Ciconia ciconia|
|128 | black stork, Ciconia nigra|
|129 | spoonbill|
|130 | flamingo|
|131 | little blue heron, Egretta caerulea|
|132 | American egret, great white heron, Egretta albus|
|133 | bittern|
|134 | crane|
|135 | limpkin, Aramus pictus|
|136 | European gallinule, Porphyrio porphyrio|
|137 | American coot, marsh hen, mud hen, water hen, Fulica americana|
|138 | bustard|
|139 | ruddy turnstone, Arenaria interpres|
|140 | red-backed sandpiper, dunlin, Erolia alpina|
|141 | redshank, Tringa totanus|
|142 | dowitcher|
|143 | oystercatcher, oyster catcher|
|144 | pelican|
|145 | king penguin, Aptenodytes patagonica|
|146 | albatross, mollymawk|
|147 | grey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus|
|148 | killer whale, killer, orca, grampus, sea wolf, Orcinus orca|
|149 | dugong, Dugong dugon|
|150 | sea lion|
|151 | Chihuahua|
|152 | Japanese spaniel|
|153 | Maltese dog, Maltese terrier, Maltese|
|154 | Pekinese, Pekingese, Peke|
|155 | Shih-Tzu|
|156 | Blenheim spaniel|
|157 | papillon|
|158 | toy terrier|
|159 | Rhodesian ridgeback|
|160 | Afghan hound, Afghan|
|161 | basset, basset hound|
|162 | beagle|
|163 | bloodhound, sleuthhound|
|164 | bluetick|
|165 | black-and-tan coonhound|
|166 | Walker hound, Walker foxhound|
|167 | English foxhound|
|168 | redbone|
|169 | borzoi, Russian wolfhound|
|170 | Irish wolfhound|
|171 | Italian greyhound|
|172 | whippet|
|173 | Ibizan hound, Ibizan Podenco|
|174 | Norwegian elkhound, elkhound|
|175 | otterhound, otter hound|
|176 | Saluki, gazelle hound|
|177 | Scottish deerhound, deerhound|
|178 | Weimaraner|
|179 | Staffordshire bullterrier, Staffordshire bull terrier|
|180 | American Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier|
|181 | Bedlington terrier|
|182 | Border terrier|
|183 | Kerry blue terrier|
|184 | Irish terrier|
|185 | Norfolk terrier|
|186 | Norwich terrier|
|187 | Yorkshire terrier|
|188 | wire-haired fox terrier|
|189 | Lakeland terrier|
|190 | Sealyham terrier, Sealyham|
|191 | Airedale, Airedale terrier|
|192 | cairn, cairn terrier|
|193 | Australian terrier|
|194 | Dandie Dinmont, Dandie Dinmont terrier|
|195 | Boston bull, Boston terrier|
|196 | miniature schnauzer|
|197 | giant schnauzer|
|198 | standard schnauzer|
|199 | Scotch terrier, Scottish terrier, Scottie|
|200 | Tibetan terrier, chrysanthemum dog|
|201 | silky terrier, Sydney silky|
|202 | soft-coated wheaten terrier|
|203 | West Highland white terrier|
|204 | Lhasa, Lhasa apso|
|205 | flat-coated retriever|
|206 | curly-coated retriever|
|207 | golden retriever|
|208 | Labrador retriever|
|209 | Chesapeake Bay retriever|
|210 | German short-haired pointer|
|211 | vizsla, Hungarian pointer|
|212 | English setter|
|213 | Irish setter, red setter|
|214 | Gordon setter|
|215 | Brittany spaniel|
|216 | clumber, clumber spaniel|
|217 | English springer, English springer spaniel|
|218 | Welsh springer spaniel|
|219 | cocker spaniel, English cocker spaniel, cocker|
|220 | Sussex spaniel|
|221 | Irish water spaniel|
|222 | kuvasz|
|223 | schipperke|
|224 | groenendael|
|225 | malinois|
|226 | briard|
|227 | kelpie|
|228 | komondor|
|229 | Old English sheepdog, bobtail|
|230 | Shetland sheepdog, Shetland sheep dog, Shetland|
|231 | collie|
|232 | Border collie|
|233 | Bouvier des Flandres, Bouviers des Flandres|
|234 | Rottweiler|
|235 | German shepherd, German shepherd dog, German police dog, alsatian|
|236 | Doberman, Doberman pinscher|
|237 | miniature pinscher|
|238 | Greater Swiss Mountain dog|
|239 | Bernese mountain dog|
|240 | Appenzeller|
|241 | EntleBucher|
|242 | boxer|
|243 | bull mastiff|
|244 | Tibetan mastiff|
|245 | French bulldog|
|246 | Great Dane|
|247 | Saint Bernard, St Bernard|
|248 | Eskimo dog, husky|
|249 | malamute, malemute, Alaskan malamute|
|250 | Siberian husky|
|251 | dalmatian, coach dog, carriage dog|
|252 | affenpinscher, monkey pinscher, monkey dog|
|253 | basenji|
|254 | pug, pug-dog|
|255 | Leonberg|
|256 | Newfoundland, Newfoundland dog|
|257 | Great Pyrenees|
|258 | Samoyed, Samoyede|
|259 | Pomeranian|
|260 | chow, chow chow|
|261 | keeshond|
|262 | Brabancon griffon|
|263 | Pembroke, Pembroke Welsh corgi|
|264 | Cardigan, Cardigan Welsh corgi|
|265 | toy poodle|
|266 | miniature poodle|
|267 | standard poodle|
|268 | Mexican hairless|
|269 | timber wolf, grey wolf, gray wolf, Canis lupus|
|270 | white wolf, Arctic wolf, Canis lupus tundrarum|
|271 | red wolf, maned wolf, Canis rufus, Canis niger|
|272 | coyote, prairie wolf, brush wolf, Canis latrans|
|273 | dingo, warrigal, warragal, Canis dingo|
|274 | dhole, Cuon alpinus|
|275 | African hunting dog, hyena dog, Cape hunting dog, Lycaon pictus|
|276 | hyena, hyaena|
|277 | red fox, Vulpes vulpes|
|278 | kit fox, Vulpes macrotis|
|279 | Arctic fox, white fox, Alopex lagopus|
|280 | grey fox, gray fox, Urocyon cinereoargenteus|
|281 | tabby, tabby cat|
|282 | tiger cat|
|283 | Persian cat|
|284 | Siamese cat, Siamese|
|285 | Egyptian cat|
|286 | cougar, puma, catamount, mountain lion, painter, panther, Felis concolor|
|287 | lynx, catamount|
|288 | leopard, Panthera pardus|
|289 | snow leopard, ounce, Panthera uncia|
|290 | jaguar, panther, Panthera onca, Felis onca|
|291 | lion, king of beasts, Panthera leo|
|292 | tiger, Panthera tigris|
|293 | cheetah, chetah, Acinonyx jubatus|
|294 | brown bear, bruin, Ursus arctos|
|295 | American black bear, black bear, Ursus americanus, Euarctos americanus|
|296 | ice bear, polar bear, Ursus Maritimus, Thalarctos maritimus|
|297 | sloth bear, Melursus ursinus, Ursus ursinus|
|298 | mongoose|
|299 | meerkat, mierkat|
|300 | tiger beetle|
|301 | ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle|
|302 | ground beetle, carabid beetle|
|303 | long-horned beetle, longicorn, longicorn beetle|
|304 | leaf beetle, chrysomelid|
|305 | dung beetle|
|306 | rhinoceros beetle|
|307 | weevil|
|308 | fly|
|309 | bee|
|310 | ant, emmet, pismire|
|311 | grasshopper, hopper|
|312 | cricket|
|313 | walking stick, walkingstick, stick insect|
|314 | cockroach, roach|
|315 | mantis, mantid|
|316 | cicada, cicala|
|317 | leafhopper|
|318 | lacewing, lacewing fly|
|319 | dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk|
|320 | damselfly|
|321 | admiral|
|322 | ringlet, ringlet butterfly|
|323 | monarch, monarch butterfly, milkweed butterfly, Danaus plexippus|
|324 | cabbage butterfly|
|325 | sulphur butterfly, sulfur butterfly|
|326 | lycaenid, lycaenid butterfly|
|327 | starfish, sea star|
|328 | sea urchin|
|329 | sea cucumber, holothurian|
|330 | wood rabbit, cottontail, cottontail rabbit|
|331 | hare|
|332 | Angora, Angora rabbit|
|333 | hamster|
|334 | porcupine, hedgehog|
|335 | fox squirrel, eastern fox squirrel, Sciurus niger|
|336 | marmot|
|337 | beaver|
|338 | guinea pig, Cavia cobaya|
|339 | sorrel|
|340 | zebra|
|341 | hog, pig, grunter, squealer, Sus scrofa|
|342 | wild boar, boar, Sus scrofa|
|343 | warthog|
|344 | hippopotamus, hippo, river horse, Hippopotamus amphibius|
|345 | ox|
|346 | water buffalo, water ox, Asiatic buffalo, Bubalus bubalis|
|347 | bison|
|348 | ram, tup|
|349 | bighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis|
|350 | ibex, Capra ibex|
|351 | hartebeest|
|352 | impala, Aepyceros melampus|
|353 | gazelle|
|354 | Arabian camel, dromedary, Camelus dromedarius|
|355 | llama|
|356 | weasel|
|357 | mink|
|358 | polecat, fitch, foulmart, foumart, Mustela putorius|
|359 | black-footed ferret, ferret, Mustela nigripes|
|360 | otter|
|361 | skunk, polecat, wood pussy|
|362 | badger|
|363 | armadillo|
|364 | three-toed sloth, ai, Bradypus tridactylus|
|365 | orangutan, orang, orangutang, Pongo pygmaeus|
|366 | gorilla, Gorilla gorilla|
|367 | chimpanzee, chimp, Pan troglodytes|
|368 | gibbon, Hylobates lar|
|369 | siamang, Hylobates syndactylus, Symphalangus syndactylus|
|370 | guenon, guenon monkey|
|371 | patas, hussar monkey, Erythrocebus patas|
|372 | baboon|
|373 | macaque|
|374 | langur|
|375 | colobus, colobus monkey|
|376 | proboscis monkey, Nasalis larvatus|
|377 | marmoset|
|378 | capuchin, ringtail, Cebus capucinus|
|379 | howler monkey, howler|
|380 | titi, titi monkey|
|381 | spider monkey, Ateles geoffroyi|
|382 | squirrel monkey, Saimiri sciureus|
|383 | Madagascar cat, ring-tailed lemur, Lemur catta|
|384 | indri, indris, Indri indri, Indri brevicaudatus|
|385 | Indian elephant, Elephas maximus|
|386 | African elephant, Loxodonta africana|
|387 | lesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens|
|388 | giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca|
|389 | barracouta, snoek|
|390 | eel|
|391 | coho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch|
|392 | rock beauty, Holocanthus tricolor|
|393 | anemone fish|
|394 | sturgeon|
|395 | gar, garfish, garpike, billfish, Lepisosteus osseus|
|396 | lionfish|
|397 | puffer, pufferfish, blowfish, globefish|
|398 | abacus|
|399 | abaya|
|400 | academic gown, academic robe, judge's robe|
|401 | accordion, piano accordion, squeeze box|
|402 | acoustic guitar|
|403 | aircraft carrier, carrier, flattop, attack aircraft carrier|
|404 | airliner|
|405 | airship, dirigible|
|406 | altar|
|407 | ambulance|
|408 | amphibian, amphibious vehicle|
|409 | analog clock|
|410 | apiary, bee house|
|411 | apron|
|412 | ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin|
|413 | assault rifle, assault gun|
|414 | backpack, back pack, knapsack, packsack, rucksack, haversack|
|415 | bakery, bakeshop, bakehouse|
|416 | balance beam, beam|
|417 | balloon|
|418 | ballpoint, ballpoint pen, ballpen, Biro|
|419 | Band Aid|
|420 | banjo|
|421 | bannister, banister, balustrade, balusters, handrail|
|422 | barbell|
|423 | barber chair|
|424 | barbershop|
|425 | barn|
|426 | barometer|
|427 | barrel, cask|
|428 | barrow, garden cart, lawn cart, wheelbarrow|
|429 | baseball|
|430 | basketball|
|431 | bassinet|
|432 | bassoon|
|433 | bathing cap, swimming cap|
|434 | bath towel|
|435 | bathtub, bathing tub, bath, tub|
|436 | beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon|
|437 | beacon, lighthouse, beacon light, pharos|
|438 | beaker|
|439 | bearskin, busby, shako|
|440 | beer bottle|
|441 | beer glass|
|442 | bell cote, bell cot|
|443 | bib|
|444 | bicycle-built-for-two, tandem bicycle, tandem|
|445 | bikini, two-piece|
|446 | binder, ring-binder|
|447 | binoculars, field glasses, opera glasses|
|448 | birdhouse|
|449 | boathouse|
|450 | bobsled, bobsleigh, bob|
|451 | bolo tie, bolo, bola tie, bola|
|452 | bonnet, poke bonnet|
|453 | bookcase|
|454 | bookshop, bookstore, bookstall|
|455 | bottlecap|
|456 | bow|
|457 | bow tie, bow-tie, bowtie|
|458 | brass, memorial tablet, plaque|
|459 | brassiere, bra, bandeau|
|460 | breakwater, groin, groyne, mole, bulwark, seawall, jetty|
|461 | breastplate, aegis, egis|
|462 | broom|
|463 | bucket, pail|
|464 | buckle|
|465 | bulletproof vest|
|466 | bullet train, bullet|
|467 | butcher shop, meat market|
|468 | cab, hack, taxi, taxicab|
|469 | caldron, cauldron|
|470 | candle, taper, wax light|
|471 | cannon|
|472 | canoe|
|473 | can opener, tin opener|
|474 | cardigan|
|475 | car mirror|
|476 | carousel, carrousel, merry-go-round, roundabout, whirligig|
|477 | carpenter's kit, tool kit|
|478 | carton|
|479 | car wheel|
|480 | cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM|
|481 | cassette|
|482 | cassette player|
|483 | castle|
|484 | catamaran|
|485 | CD player|
|486 | cello, violoncello|
|487 | cellular telephone, cellular phone, cellphone, cell, mobile phone|
|488 | chain|
|489 | chainlink fence|
|490 | chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour|
|491 | chain saw, chainsaw|
|492 | chest|
|493 | chiffonier, commode|
|494 | chime, bell, gong|
|495 | china cabinet, china closet|
|496 | Christmas stocking|
|497 | church, church building|
|498 | cinema, movie theater, movie theatre, movie house, picture palace|
|499 | cleaver, meat cleaver, chopper|
|500 | cliff dwelling|
|501 | cloak|
|502 | clog, geta, patten, sabot|
|503 | cocktail shaker|
|504 | coffee mug|
|505 | coffeepot|
|506 | coil, spiral, volute, whorl, helix|
|507 | combination lock|
|508 | computer keyboard, keypad|
|509 | confectionery, confectionary, candy store|
|510 | container ship, containership, container vessel|
|511 | convertible|
|512 | corkscrew, bottle screw|
|513 | cornet, horn, trumpet, trump|
|514 | cowboy boot|
|515 | cowboy hat, ten-gallon hat|
|516 | cradle|
|517 | crane_1|
|518 | crash helmet|
|519 | crate|
|520 | crib, cot|
|521 | Crock Pot|
|522 | croquet ball|
|523 | crutch|
|524 | cuirass|
|525 | dam, dike, dyke|
|526 | desk|
|527 | desktop computer|
|528 | dial telephone, dial phone|
|529 | diaper, nappy, napkin|
|530 | digital clock|
|531 | digital watch|
|532 | dining table, board|
|533 | dishrag, dishcloth|
|534 | dishwasher, dish washer, dishwashing machine|
|535 | disk brake, disc brake|
|536 | dock, dockage, docking facility|
|537 | dogsled, dog sled, dog sleigh|
|538 | dome|
|539 | doormat, welcome mat|
|540 | drilling platform, offshore rig|
|541 | drum, membranophone, tympan|
|542 | drumstick|
|543 | dumbbell|
|544 | Dutch oven|
|545 | electric fan, blower|
|546 | electric guitar|
|547 | electric locomotive|
|548 | entertainment center|
|549 | envelope|
|550 | espresso maker|
|551 | face powder|
|552 | feather boa, boa|
|553 | file, file cabinet, filing cabinet|
|554 | fireboat|
|555 | fire engine, fire truck|
|556 | fire screen, fireguard|
|557 | flagpole, flagstaff|
|558 | flute, transverse flute|
|559 | folding chair|
|560 | football helmet|
|561 | forklift|
|562 | fountain|
|563 | fountain pen|
|564 | four-poster|
|565 | freight car|
|566 | French horn, horn|
|567 | frying pan, frypan, skillet|
|568 | fur coat|
|569 | garbage truck, dustcart|
|570 | gasmask, respirator, gas helmet|
|571 | gas pump, gasoline pump, petrol pump, island dispenser|
|572 | goblet|
|573 | go-kart|
|574 | golf ball|
|575 | golfcart, golf cart|
|576 | gondola|
|577 | gong, tam-tam|
|578 | gown|
|579 | grand piano, grand|
|580 | greenhouse, nursery, glasshouse|
|581 | grille, radiator grille|
|582 | grocery store, grocery, food market, market|
|583 | guillotine|
|584 | hair slide|
|585 | hair spray|
|586 | half track|
|587 | hammer|
|588 | hamper|
|589 | hand blower, blow dryer, blow drier, hair dryer, hair drier|
|590 | hand-held computer, hand-held microcomputer|
|591 | handkerchief, hankie, hanky, hankey|
|592 | hard disc, hard disk, fixed disk|
|593 | harmonica, mouth organ, harp, mouth harp|
|594 | harp|
|595 | harvester, reaper|
|596 | hatchet|
|597 | holster|
|598 | home theater, home theatre|
|599 | honeycomb|
|600 | hook, claw|
|601 | hoopskirt, crinoline|
|602 | horizontal bar, high bar|
|603 | horse cart, horse-cart|
|604 | hourglass|
|605 | iPod|
|606 | iron, smoothing iron|
|607 | jack-o'-lantern|
|608 | jean, blue jean, denim|
|609 | jeep, landrover|
|610 | jersey, T-shirt, tee shirt|
|611 | jigsaw puzzle|
|612 | jinrikisha, ricksha, rickshaw|
|613 | joystick|
|614 | kimono|
|615 | knee pad|
|616 | knot|
|617 | lab coat, laboratory coat|
|618 | ladle|
|619 | lampshade, lamp shade|
|620 | laptop, laptop computer|
|621 | lawn mower, mower|
|622 | lens cap, lens cover|
|623 | letter opener, paper knife, paperknife|
|624 | library|
|625 | lifeboat|
|626 | lighter, light, igniter, ignitor|
|627 | limousine, limo|
|628 | liner, ocean liner|
|629 | lipstick, lip rouge|
|630 | Loafer|
|631 | lotion|
|632 | loudspeaker, speaker, speaker unit, loudspeaker system, speaker system|
|633 | loupe, jeweler's loupe|
|634 | lumbermill, sawmill|
|635 | magnetic compass|
|636 | mailbag, postbag|
|637 | mailbox, letter box|
|638 | maillot|
|639 | maillot, tank suit|
|640 | manhole cover|
|641 | maraca|
|642 | marimba, xylophone|
|643 | mask|
|644 | matchstick|
|645 | maypole|
|646 | maze, labyrinth|
|647 | measuring cup|
|648 | medicine chest, medicine cabinet|
|649 | megalith, megalithic structure|
|650 | microphone, mike|
|651 | microwave, microwave oven|
|652 | military uniform|
|653 | milk can|
|654 | minibus|
|655 | miniskirt, mini|
|656 | minivan|
|657 | missile|
|658 | mitten|
|659 | mixing bowl|
|660 | mobile home, manufactured home|
|661 | Model T|
|662 | modem|
|663 | monastery|
|664 | monitor|
|665 | moped|
|666 | mortar|
|667 | mortarboard|
|668 | mosque|
|669 | mosquito net|
|670 | motor scooter, scooter|
|671 | mountain bike, all-terrain bike, off-roader|
|672 | mountain tent|
|673 | mouse, computer mouse|
|674 | mousetrap|
|675 | moving van|
|676 | muzzle|
|677 | nail|
|678 | neck brace|
|679 | necklace|
|680 | nipple|
|681 | notebook, notebook computer|
|682 | obelisk|
|683 | oboe, hautboy, hautbois|
|684 | ocarina, sweet potato|
|685 | odometer, hodometer, mileometer, milometer|
|686 | oil filter|
|687 | organ, pipe organ|
|688 | oscilloscope, scope, cathode-ray oscilloscope, CRO|
|689 | overskirt|
|690 | oxcart|
|691 | oxygen mask|
|692 | packet|
|693 | paddle, boat paddle|
|694 | paddlewheel, paddle wheel|
|695 | padlock|
|696 | paintbrush|
|697 | pajama, pyjama, pj's, jammies|
|698 | palace|
|699 | panpipe, pandean pipe, syrinx|
|700 | paper towel|
|701 | parachute, chute|
|702 | parallel bars, bars|
|703 | park bench|
|704 | parking meter|
|705 | passenger car, coach, carriage|
|706 | patio, terrace|
|707 | pay-phone, pay-station|
|708 | pedestal, plinth, footstall|
|709 | pencil box, pencil case|
|710 | pencil sharpener|
|711 | perfume, essence|
|712 | Petri dish|
|713 | photocopier|
|714 | pick, plectrum, plectron|
|715 | pickelhaube|
|716 | picket fence, paling|
|717 | pickup, pickup truck|
|718 | pier|
|719 | piggy bank, penny bank|
|720 | pill bottle|
|721 | pillow|
|722 | ping-pong ball|
|723 | pinwheel|
|724 | pirate, pirate ship|
|725 | pitcher, ewer|
|726 | plane, carpenter's plane, woodworking plane|
|727 | planetarium|
|728 | plastic bag|
|729 | plate rack|
|730 | plow, plough|
|731 | plunger, plumber's helper|
|732 | Polaroid camera, Polaroid Land camera|
|733 | pole|
|734 | police van, police wagon, paddy wagon, patrol wagon, wagon, black Maria|
|735 | poncho|
|736 | pool table, billiard table, snooker table|
|737 | pop bottle, soda bottle|
|738 | pot, flowerpot|
|739 | potter's wheel|
|740 | power drill|
|741 | prayer rug, prayer mat|
|742 | printer|
|743 | prison, prison house|
|744 | projectile, missile|
|745 | projector|
|746 | puck, hockey puck|
|747 | punching bag, punch bag, punching ball, punchball|
|748 | purse|
|749 | quill, quill pen|
|750 | quilt, comforter, comfort, puff|
|751 | racer, race car, racing car|
|752 | racket, racquet|
|753 | radiator|
|754 | radio, wireless|
|755 | radio telescope, radio reflector|
|756 | rain barrel|
|757 | recreational vehicle, RV, R.V.|
|758 | reel|
|759 | reflex camera|
|760 | refrigerator, icebox|
|761 | remote control, remote|
|762 | restaurant, eating house, eating place, eatery|
|763 | revolver, six-gun, six-shooter|
|764 | rifle|
|765 | rocking chair, rocker|
|766 | rotisserie|
|767 | rubber eraser, rubber, pencil eraser|
|768 | rugby ball|
|769 | rule, ruler|
|770 | running shoe|
|771 | safe|
|772 | safety pin|
|773 | saltshaker, salt shaker|
|774 | sandal|
|775 | sarong|
|776 | sax, saxophone|
|777 | scabbard|
|778 | scale, weighing machine|
|779 | school bus|
|780 | schooner|
|781 | scoreboard|
|782 | screen, CRT screen|
|783 | screw|
|784 | screwdriver|
|785 | seat belt, seatbelt|
|786 | sewing machine|
|787 | shield, buckler|
|788 | shoe shop, shoe-shop, shoe store|
|789 | shoji|
|790 | shopping basket|
|791 | shopping cart|
|792 | shovel|
|793 | shower cap|
|794 | shower curtain|
|795 | ski|
|796 | ski mask|
|797 | sleeping bag|
|798 | slide rule, slipstick|
|799 | sliding door|
|800 | slot, one-armed bandit|
|801 | snorkel|
|802 | snowmobile|
|803 | snowplow, snowplough|
|804 | soap dispenser|
|805 | soccer ball|
|806 | sock|
|807 | solar dish, solar collector, solar furnace|
|808 | sombrero|
|809 | soup bowl|
|810 | space bar|
|811 | space heater|
|812 | space shuttle|
|813 | spatula|
|814 | speedboat|
|815 | spider web, spider's web|
|816 | spindle|
|817 | sports car, sport car|
|818 | spotlight, spot|
|819 | stage|
|820 | steam locomotive|
|821 | steel arch bridge|
|822 | steel drum|
|823 | stethoscope|
|824 | stole|
|825 | stone wall|
|826 | stopwatch, stop watch|
|827 | stove|
|828 | strainer|
|829 | streetcar, tram, tramcar, trolley, trolley car|
|830 | stretcher|
|831 | studio couch, day bed|
|832 | stupa, tope|
|833 | submarine, pigboat, sub, U-boat|
|834 | suit, suit of clothes|
|835 | sundial|
|836 | sunglass|
|837 | sunglasses, dark glasses, shades|
|838 | sunscreen, sunblock, sun blocker|
|839 | suspension bridge|
|840 | swab, swob, mop|
|841 | sweatshirt|
|842 | swimming trunks, bathing trunks|
|843 | swing|
|844 | switch, electric switch, electrical switch|
|845 | syringe|
|846 | table lamp|
|847 | tank, army tank, armored combat vehicle, armoured combat vehicle|
|848 | tape player|
|849 | teapot|
|850 | teddy, teddy bear|
|851 | television, television system|
|852 | tennis ball|
|853 | thatch, thatched roof|
|854 | theater curtain, theatre curtain|
|855 | thimble|
|856 | thresher, thrasher, threshing machine|
|857 | throne|
|858 | tile roof|
|859 | toaster|
|860 | tobacco shop, tobacconist shop, tobacconist|
|861 | toilet seat|
|862 | torch|
|863 | totem pole|
|864 | tow truck, tow car, wrecker|
|865 | toyshop|
|866 | tractor|
|867 | trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi|
|868 | tray|
|869 | trench coat|
|870 | tricycle, trike, velocipede|
|871 | trimaran|
|872 | tripod|
|873 | triumphal arch|
|874 | trolleybus, trolley coach, trackless trolley|
|875 | trombone|
|876 | tub, vat|
|877 | turnstile|
|878 | typewriter keyboard|
|879 | umbrella|
|880 | unicycle, monocycle|
|881 | upright, upright piano|
|882 | vacuum, vacuum cleaner|
|883 | vase|
|884 | vault|
|885 | velvet|
|886 | vending machine|
|887 | vestment|
|888 | viaduct|
|889 | violin, fiddle|
|890 | volleyball|
|891 | waffle iron|
|892 | wall clock|
|893 | wallet, billfold, notecase, pocketbook|
|894 | wardrobe, closet, press|
|895 | warplane, military plane|
|896 | washbasin, handbasin, washbowl, lavabo, wash-hand basin|
|897 | washer, automatic washer, washing machine|
|898 | water bottle|
|899 | water jug|
|900 | water tower|
|901 | whiskey jug|
|902 | whistle|
|903 | wig|
|904 | window screen|
|905 | window shade|
|906 | Windsor tie|
|907 | wine bottle|
|908 | wing|
|909 | wok|
|910 | wooden spoon|
|911 | wool, woolen, woollen|
|912 | worm fence, snake fence, snake-rail fence, Virginia fence|
|913 | wreck|
|914 | yawl|
|915 | yurt|
|916 | web site, website, internet site, site|
|917 | comic book|
|918 | crossword puzzle, crossword|
|919 | street sign|
|920 | traffic light, traffic signal, stoplight|
|921 | book jacket, dust cover, dust jacket, dust wrapper|
|922 | menu|
|923 | plate|
|924 | guacamole|
|925 | consomme|
|926 | hot pot, hotpot|
|927 | trifle|
|928 | ice cream, icecream|
|929 | ice lolly, lolly, lollipop, popsicle|
|930 | French loaf|
|931 | bagel, beigel|
|932 | pretzel|
|933 | cheeseburger|
|934 | hotdog, hot dog, red hot|
|935 | mashed potato|
|936 | head cabbage|
|937 | broccoli|
|938 | cauliflower|
|939 | zucchini, courgette|
|940 | spaghetti squash|
|941 | acorn squash|
|942 | butternut squash|
|943 | cucumber, cuke|
|944 | artichoke, globe artichoke|
|945 | bell pepper|
|946 | cardoon|
|947 | mushroom|
|948 | Granny Smith|
|949 | strawberry|
|950 | orange|
|951 | lemon|
|952 | fig|
|953 | pineapple, ananas|
|954 | banana|
|955 | jackfruit, jak, jack|
|956 | custard apple|
|957 | pomegranate|
|958 | hay|
|959 | carbonara|
|960 | chocolate sauce, chocolate syrup|
|961 | dough|
|962 | meat loaf, meatloaf|
|963 | pizza, pizza pie|
|964 | potpie|
|965 | burrito|
|966 | red wine|
|967 | espresso|
|968 | cup|
|969 | eggnog|
|970 | alp|
|971 | bubble|
|972 | cliff, drop, drop-off|
|973 | coral reef|
|974 | geyser|
|975 | lakeside, lakeshore|
|976 | promontory, headland, head, foreland|
|977 | sandbar, sand bar|
|978 | seashore, coast, seacoast, sea-coast|
|979 | valley, vale|
|980 | volcano|
|981 | ballplayer, baseball player|
|982 | groom, bridegroom|
|983 | scuba diver|
|984 | rapeseed|
|985 | daisy|
|986 | yellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum|
|987 | corn|
|988 | acorn|
|989 | hip, rose hip, rosehip|
|990 | buckeye, horse chestnut, conker|
|991 | coral fungus|
|992 | agaric|
|993 | gyromitra|
|994 | stinkhorn, carrion fungus|
|995 | earthstar|
|996 | hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa|
|997 | bolete|
|998 | ear, spike, capitulum|
|999 | toilet tissue, toilet paper, bathroom tissue|
</details>
### Data Splits
This dataset is a validation-only set.
## Dataset Creation
### Source Data
This dataset is sourced from ImageNet, ImageNet-ReaL, ImageNet-V2, ImageNet-A, ImageNet-C, ImageNet-R, ImageNet-Sketch, and ObjectNet.
## Citation Information
```
@article{taesiri2023zoom,
title={ImageNet-Hard: The Hardest Images Remaining from a Study of the Power of Zoom and Spatial Biases in Image Classification},
author={Taesiri, Mohammad Reza and Nguyen, Giang and Habchi, Sarra and Bezemer, Cor-Paul and Nguyen, Anh},
journal={arXiv preprint arXiv:2304.05538},
year={2023}
}
``` |
philikai/spider_SQL_PALM_Prompt | ---
license: cc-by-sa-4.0
---
Dataset for creating prompts for fine-tuning on the Spider dataset, including foreign- and primary-key information as well as schema information. |
veroinesc/test1 | ---
license: unknown
---
|
isaiahbjork/alpaca-function-calling-json | ---
license: mit
---
|
silviaarellano/heightmaps | ---
license: mit
task_categories:
- image-to-image
tags:
- maps
- heightmaps
--- |
rastogi/FNS | ---
language:
- en
--- |
Chris126/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 0
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thangvip/luat-2020-2023 | ---
dataset_info:
features:
- name: content
dtype: string
- name: citation
dtype: string
- name: meta
struct:
- name: effective_date
dtype: string
- name: issuing_agency
dtype: string
- name: promulgation_date
dtype: string
- name: sign_number
dtype: string
- name: signer
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: text
dtype: string
splits:
- name: luat
num_bytes: 7414708.290523045
num_examples: 2698
download_size: 3681024
dataset_size: 7414708.290523045
configs:
- config_name: default
data_files:
- split: luat
path: data/luat-*
---
|
infinilabs/app-downloading-logs-nginx-dataset | ---
license: apache-2.0
---
# Summary
This dataset can be freely used for AI research.
```
root@infini:/# cat infini-release.log |wc -l
1550474
root@infini:/# head -n 10 infini-release.log
175.10.73.78 - - [16/Mar/2022:15:36:35 +0800] "GET / HTTP/1.1" 200 331 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36"
175.10.73.78 - - [16/Mar/2022:15:36:36 +0800] "GET / HTTP/1.1" 200 331 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36"
175.10.73.78 - - [16/Mar/2022:15:36:36 +0800] "GET / HTTP/1.1" 200 331 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36"
175.10.73.78 - - [16/Mar/2022:15:36:37 +0800] "GET /gateway/ HTTP/1.1" 200 331 "http://release.infinilabs.com/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36"
175.10.73.78 - - [16/Mar/2022:15:36:37 +0800] "GET /gateway/stable/ HTTP/1.1" 200 319 "http://release.infinilabs.com/gateway/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36"
175.10.73.78 - - [16/Mar/2022:15:36:38 +0800] "GET /gateway/stable/archive/ HTTP/1.1" 200 345 "http://release.infinilabs.com/gateway/stable/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36"
175.10.73.78 - - [16/Mar/2022:15:36:39 +0800] "GET /gateway/stable/ HTTP/1.1" 200 319 "http://release.infinilabs.com/gateway/stable/archive/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36"
175.10.73.78 - - [16/Mar/2022:15:36:39 +0800] "GET /gateway/ HTTP/1.1" 200 331 "http://release.infinilabs.com/gateway/stable/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36"
175.10.73.78 - - [16/Mar/2022:15:36:40 +0800] "GET /gateway/snapshot/ HTTP/1.1" 200 567 "http://release.infinilabs.com/gateway/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36"
175.10.73.78 - - [16/Mar/2022:15:36:41 +0800] "GET /gateway/ HTTP/1.1" 200 331 "http://release.infinilabs.com/gateway/snapshot/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36"
```
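The log lines above follow nginx's standard combined format. As a minimal sketch of how they can be parsed, the regex and field names below are my own assumptions, not part of the dataset:

```python
import re

# Combined log format: IP, identity, user, [timestamp], "request", status, bytes, "referrer", "user agent"
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) (?P<proto>[^"]+)" '
    r'(?P<status>\d{3}) (?P<size>\d+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    """Return a dict of fields for one access-log line, or None if it doesn't match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

sample = ('175.10.73.78 - - [16/Mar/2022:15:36:37 +0800] '
          '"GET /gateway/ HTTP/1.1" 200 331 '
          '"http://release.infinilabs.com/" "Mozilla/5.0"')
record = parse_line(sample)  # e.g. record["path"] == "/gateway/"
```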
# License
The dataset is licensed under the Apache 2.0 license agreement.
# Contact
- [http://infinilabs.com](http://infinilabs.com)
- hello#infini.ltd |
jinho8345/funsd-bioes | ---
dataset_info:
features:
- name: img
dtype: image
- name: labels
sequence: string
- name: words
sequence: string
- name: bboxes
sequence:
sequence: int64
- name: filename
dtype: string
splits:
- name: train
num_bytes: 13281297.0
num_examples: 149
- name: val
num_bytes: 4777640.0
num_examples: 50
download_size: 16636452
dataset_size: 18058937.0
---
# Dataset Card for "funsd-bioes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Team-PIXEL/PIXELSum_uk_wiki_for_TA | ---
license: apache-2.0
dataset_info:
features:
- name: text
struct:
- name: bytes
dtype: binary
- name: path
dtype: 'null'
- name: target
dtype: string
- name: num_text_patches
dtype: int64
splits:
- name: train
num_bytes: 40401004873
num_examples: 3968117
download_size: 38963135270
dataset_size: 40401004873
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Norod78/cartoon-blip-captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 190959102.953
num_examples: 3141
download_size: 190279356
dataset_size: 190959102.953
pretty_name: 'Cartoon BLIP captions'
size_categories:
- n<1K
tags: []
task_categories:
- text-to-image
license: cc-by-nc-sa-4.0
annotations_creators:
- machine-generated
language:
- en
language_creators:
- other
multilinguality:
- monolingual
---
# Dataset Card for "cartoon-blip-captions"
|
torchgeo/l7irish | ---
task_categories:
- image-segmentation
tags:
- climate
pretty_name: L7 Irish
size_categories:
- n<1K
license: cc0-1.0
---
Redistribution of data from https://www.sciencebase.gov/catalog/item/573ccf18e4b0dae0d5e4b109. Some files renamed for consistency. Corrupted or missing files replaced with data from https://landsat.usgs.gov/landsat-7-cloud-cover-assessment-validation-data.
Landsat Data Distribution Policy: https://www.usgs.gov/media/files/landsat-data-distribution-policy |
japanese-asr/whisper_transcriptions.reazonspeech.large.wer_10.0.vectorized | ---
dataset_info:
config_name: large
features:
- name: input_length
dtype: int64
- name: labels
sequence: int64
- name: input_features
sequence:
sequence: float32
splits:
- name: train
num_bytes: 1602595794864
num_examples: 1042883
download_size: 296247756094
dataset_size: 1602595794864
configs:
- config_name: large
data_files:
- split: train
path: large/train-*
---
|
Dmenorsz/mckevin | ---
license: openrail
---
|
chloeliu/reddit_nosleep_posts | ---
license: unknown
---
|
jungledude23/llama-subtitle-mini | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 980853
num_examples: 447
download_size: 184428
dataset_size: 980853
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sayakpaul/ucf101-subset | ---
license: apache-2.0
---
This dataset repository contains a subset of the UCF-101 dataset [1]. The subset archive was obtained using the code from [this guide](https://www.tensorflow.org/tutorials/load_data/video).
### References
[1] UCF101: A Dataset of 101 Human Actions Classes From Videos in The Wild, https://arxiv.org/abs/1212.0402. |
abhishek-mungoli/srk_wittiness | ---
license: cc
---
# Shah Rukh Khan Wittiness Reply DataSet
Shah Rukh Khan is known for his wit, humor, and charm. I carefully crafted some questions and used ChatGPT to imagine how SRK would have replied to them. You can use this data to train a smaller open-source language model or for any other use case you have in mind.
|
BatsResearch/sib200-LexC-Gen | ---
language:
- tum
- ee
- ln
- fj
- ts
- bm
- sg
- ak
- lus
- gn
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
task_categories:
- text-classification
task_ids:
- topic-classification
tags:
- news-topic
- sib-200
- sib200
- synthetic
dataset_info:
- config_name: ak_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 3587478
num_examples: 22062
- name: validation
num_bytes: 14755
num_examples: 99
download_size: 2185047
dataset_size: 3602233
- config_name: ak_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 370304
num_examples: 2271
- name: validation
num_bytes: 14755
num_examples: 99
download_size: 239976
dataset_size: 385059
- config_name: ak_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 36361
num_examples: 229
- name: validation
num_bytes: 14755
num_examples: 99
download_size: 37326
dataset_size: 51116
- config_name: bm_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 3796341
num_examples: 19972
- name: validation
num_bytes: 15791
num_examples: 99
download_size: 2248093
dataset_size: 3812132
- config_name: bm_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 385755
num_examples: 2257
- name: validation
num_bytes: 15791
num_examples: 99
download_size: 245275
dataset_size: 401546
- config_name: bm_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 39450
num_examples: 201
- name: validation
num_bytes: 15791
num_examples: 99
download_size: 39023
dataset_size: 55241
- config_name: ee_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 3845466
num_examples: 22352
- name: validation
num_bytes: 15477
num_examples: 99
download_size: 2312846
dataset_size: 3860943
- config_name: ee_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 385266
num_examples: 2230
- name: validation
num_bytes: 15477
num_examples: 99
download_size: 245696
dataset_size: 400743
- config_name: ee_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 43044
num_examples: 252
- name: validation
num_bytes: 15477
num_examples: 99
download_size: 41559
dataset_size: 58521
- config_name: fj_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 3720751
num_examples: 22343
- name: validation
num_bytes: 15135
num_examples: 99
download_size: 2211095
dataset_size: 3735886
- config_name: fj_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 367761
num_examples: 2224
- name: validation
num_bytes: 15135
num_examples: 99
download_size: 231436
dataset_size: 382896
- config_name: fj_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 37902
num_examples: 228
- name: validation
num_bytes: 15135
num_examples: 99
download_size: 38113
dataset_size: 53037
- config_name: gn_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 4493339
num_examples: 22231
- name: validation
num_bytes: 17407
num_examples: 99
download_size: 2430340
dataset_size: 4510746
- config_name: gn_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 453561
num_examples: 2229
- name: validation
num_bytes: 17407
num_examples: 99
download_size: 258889
dataset_size: 470968
- config_name: gn_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 45320
num_examples: 217
- name: validation
num_bytes: 17407
num_examples: 99
download_size: 40876
dataset_size: 62727
- config_name: ln_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 3925088
num_examples: 22445
- name: validation
num_bytes: 15683
num_examples: 99
download_size: 2255900
dataset_size: 3940771
- config_name: ln_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 393944
num_examples: 2231
- name: validation
num_bytes: 15683
num_examples: 99
download_size: 240507
dataset_size: 409627
- config_name: ln_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 38076
num_examples: 223
- name: validation
num_bytes: 15683
num_examples: 99
download_size: 37096
dataset_size: 53759
- config_name: lus_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 3807289
num_examples: 22331
- name: validation
num_bytes: 15417
num_examples: 99
download_size: 2266155
dataset_size: 3822706
- config_name: lus_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 386103
num_examples: 2266
- name: validation
num_bytes: 15417
num_examples: 99
download_size: 244118
dataset_size: 401520
- config_name: lus_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 37926
num_examples: 218
- name: validation
num_bytes: 15417
num_examples: 99
download_size: 37815
dataset_size: 53343
- config_name: sg_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 3843121
num_examples: 21752
- name: validation
num_bytes: 15569
num_examples: 99
download_size: 2211613
dataset_size: 3858690
- config_name: sg_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 387784
num_examples: 2203
- name: validation
num_bytes: 15569
num_examples: 99
download_size: 237669
dataset_size: 403353
- config_name: sg_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 37561
num_examples: 212
- name: validation
num_bytes: 15569
num_examples: 99
download_size: 37004
dataset_size: 53130
- config_name: ts_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 3661185
num_examples: 20069
- name: validation
num_bytes: 15126
num_examples: 99
download_size: 2290947
dataset_size: 3676311
- config_name: ts_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 377366
num_examples: 2079
- name: validation
num_bytes: 15126
num_examples: 99
download_size: 251583
dataset_size: 392492
- config_name: ts_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 35059
num_examples: 188
- name: validation
num_bytes: 15126
num_examples: 99
download_size: 37964
dataset_size: 50185
- config_name: tum_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 4117789
num_examples: 21667
- name: validation
num_bytes: 15922
num_examples: 99
download_size: 2480890
dataset_size: 4133711
- config_name: tum_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 415921
num_examples: 2344
- name: validation
num_bytes: 15922
num_examples: 99
download_size: 262403
dataset_size: 431843
- config_name: tum_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 39665
num_examples: 209
- name: validation
num_bytes: 15922
num_examples: 99
download_size: 39937
dataset_size: 55587
configs:
- config_name: ak_100k
data_files:
- split: train
path: ak_100k/train-*
- split: validation
path: ak_100k/validation-*
- config_name: ak_10k
data_files:
- split: train
path: ak_10k/train-*
- split: validation
path: ak_10k/validation-*
- config_name: ak_1k
data_files:
- split: train
path: ak_1k/train-*
- split: validation
path: ak_1k/validation-*
- config_name: bm_100k
data_files:
- split: train
path: bm_100k/train-*
- split: validation
path: bm_100k/validation-*
- config_name: bm_10k
data_files:
- split: train
path: bm_10k/train-*
- split: validation
path: bm_10k/validation-*
- config_name: bm_1k
data_files:
- split: train
path: bm_1k/train-*
- split: validation
path: bm_1k/validation-*
- config_name: ee_100k
data_files:
- split: train
path: ee_100k/train-*
- split: validation
path: ee_100k/validation-*
- config_name: ee_10k
data_files:
- split: train
path: ee_10k/train-*
- split: validation
path: ee_10k/validation-*
- config_name: ee_1k
data_files:
- split: train
path: ee_1k/train-*
- split: validation
path: ee_1k/validation-*
- config_name: fj_100k
data_files:
- split: train
path: fj_100k/train-*
- split: validation
path: fj_100k/validation-*
- config_name: fj_10k
data_files:
- split: train
path: fj_10k/train-*
- split: validation
path: fj_10k/validation-*
- config_name: fj_1k
data_files:
- split: train
path: fj_1k/train-*
- split: validation
path: fj_1k/validation-*
- config_name: gn_100k
data_files:
- split: train
path: gn_100k/train-*
- split: validation
path: gn_100k/validation-*
- config_name: gn_10k
data_files:
- split: train
path: gn_10k/train-*
- split: validation
path: gn_10k/validation-*
- config_name: gn_1k
data_files:
- split: train
path: gn_1k/train-*
- split: validation
path: gn_1k/validation-*
- config_name: ln_100k
data_files:
- split: train
path: ln_100k/train-*
- split: validation
path: ln_100k/validation-*
- config_name: ln_10k
data_files:
- split: train
path: ln_10k/train-*
- split: validation
path: ln_10k/validation-*
- config_name: ln_1k
data_files:
- split: train
path: ln_1k/train-*
- split: validation
path: ln_1k/validation-*
- config_name: lus_100k
data_files:
- split: train
path: lus_100k/train-*
- split: validation
path: lus_100k/validation-*
- config_name: lus_10k
data_files:
- split: train
path: lus_10k/train-*
- split: validation
path: lus_10k/validation-*
- config_name: lus_1k
data_files:
- split: train
path: lus_1k/train-*
- split: validation
path: lus_1k/validation-*
- config_name: sg_100k
data_files:
- split: train
path: sg_100k/train-*
- split: validation
path: sg_100k/validation-*
- config_name: sg_10k
data_files:
- split: train
path: sg_10k/train-*
- split: validation
path: sg_10k/validation-*
- config_name: sg_1k
data_files:
- split: train
path: sg_1k/train-*
- split: validation
path: sg_1k/validation-*
- config_name: ts_100k
data_files:
- split: train
path: ts_100k/train-*
- split: validation
path: ts_100k/validation-*
- config_name: ts_10k
data_files:
- split: train
path: ts_10k/train-*
- split: validation
path: ts_10k/validation-*
- config_name: ts_1k
data_files:
- split: train
path: ts_1k/train-*
- split: validation
path: ts_1k/validation-*
- config_name: tum_100k
data_files:
- split: train
path: tum_100k/train-*
- split: validation
path: tum_100k/validation-*
- config_name: tum_10k
data_files:
- split: train
path: tum_10k/train-*
- split: validation
path: tum_10k/validation-*
- config_name: tum_1k
data_files:
- split: train
path: tum_1k/train-*
- split: validation
path: tum_1k/validation-*
pretty_name: LexC-Gen generated data for SIB-200
---
# Dataset Card for sib200-LexC-Gen
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Project Page](https://batsresearch.github.io/lexcgen/)
- **Repository:** [Github Repo](https://github.com/BatsResearch/LexC-Gen)
- **Paper:** [Arxiv](https://arxiv.org/abs/2402.14086)
- **Point of Contact:** [Zheng-Xin Yong](mailto:contact.yong@brown.edu)
### Dataset Summary
The LexC-Gen dataset for the [SIB-200 topic classification](https://huggingface.co/datasets/Davlan/sib200) task is generated at scale for low-resource languages with large language models ([BLOOMZ-7.1B](https://arxiv.org/abs/2211.01786)) and [Gatitos bilingual lexicons](https://aclanthology.org/2023.emnlp-main.26/).
```python3
from datasets import load_dataset
dataset = load_dataset("BatsResearch/sib200-LexC-Gen", "gn_100k")
```
### Supported Tasks and Leaderboards
- `text-classification`, `topic-classification`: The dataset can be used to train a model for topic classification. The model performance is evaluated based on the accuracy of the predicted labels as compared to the given labels in the dataset.
### Languages
The texts cover 10 extremely low-resource languages:
- Tumbuka (`tum`)
- Ewe (`ee`)
- Lingala (`ln`)
- Fijian (`fj`)
- Tsonga (`ts`)
- Bambara (`bm`)
- Sango (`sg`)
- Twi (`ak`)
- Mizo (`lus`)
- Guarani (`gn`)
## Dataset Structure
### Data Instances
Each data instance contains the following features: _id_, _text_ and _label_. The _label_ has 7 possible values (0 to 6), which respectively correspond to
```
["science/technology", "travel", "politics", "sports", "health", "entertainment", "geography"]
```
An example from the LexC-Gen train set looks like the following:
```
{'id': '1',
'text': 'Mr. Smith ( ha'e narrator ) says péva peteĩva yvypóra jepy'amongeta péva taking drugs ikatu japo hikuái "" ñandu iporã "" . He ends rupi saying péva drugs oĩ iñangave'ỹva , ha opaite arapygua va'erã ha'ã g̃uarã-hag̃ua jehekýi using hikuái .',
'label': 4}
```
### Data Fields
- 'id': unique id
- 'text': generated text from LLMs
- 'label': an integer from 0 to 6, corresponding to the topic classes listed above.
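As a convenience (the helper below is our own, not part of the dataset), integer labels can be mapped back to topic names using the order given above:

```python
# Topic names in label order (0-6), mirroring the list in this card.
SIB200_TOPICS = [
    "science/technology", "travel", "politics",
    "sports", "health", "entertainment", "geography",
]

def label_to_topic(label: int) -> str:
    """Map a dataset label (0-6) to its topic name."""
    return SIB200_TOPICS[label]

label_to_topic(4)  # → "health", matching the example instance above
```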
### Data Splits
The LexC-Gen dataset requires config name in the format of `{lang}_{size}`. The `lang` refers to the language code, and the `size` refers to the size of LexC-Gen dataset before input-label consistency filtering, which takes values of `1k`, `10k`, or `100k`.
The LexC-Gen dataset has 2 splits: _train_, _validation_. The _train_ split refers to the generated LexC-Gen task training data. The _validation_ split refers to the SIB-200 validation data that has been word translated.
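As a sketch (the helper and its validation are our own additions), a config name can be assembled from a language code and size before loading:

```python
# Supported language codes and pre-filtering sizes, per this card.
LANGS = {"tum", "ee", "ln", "fj", "ts", "bm", "sg", "ak", "lus", "gn"}
SIZES = {"1k", "10k", "100k"}

def config_name(lang: str, size: str) -> str:
    """Build a `{lang}_{size}` config name, validating both parts."""
    if lang not in LANGS:
        raise ValueError(f"unknown language code: {lang!r}")
    if size not in SIZES:
        raise ValueError(f"unknown size: {size!r}")
    return f"{lang}_{size}"

# e.g. load_dataset("BatsResearch/sib200-LexC-Gen", config_name("gn", "100k"))
```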
## Dataset Creation
### Curation Rationale
Extremely low-resource languages have virtually no labeled data. We explore generating data at scale for these languages using high-resource-language task data, LLMs, and bilingual lexicons to overcome the data bottleneck.
We upload the dataset to Huggingface as an artifact of our research and to ensure reproducibility of our results in our paper.
## Considerations for Using the Data
Our dataset is a synthetic dataset generated in English by LLMs and then translated into low-resource languages through word-to-word translation with bilingual lexicons.
It may contain English words due to imperfect translation, and it uses English syntax such as SVO word order, which is not necessarily representative of the syntax of the low-resource languages.
## Additional Information
### Dataset Curators
The LexC-Gen synthetic dataset is created by Zheng-Xin Yong.
### Licensing Information
Our dataset is generated from BLOOMZ models, which uses the [BigScience RAIL License v1.0](https://huggingface.co/spaces/bigscience/license). Therefore, the RAIL license would apply to classifiers that are finetuned on our LexC-Gen dataset.
### Citation Information
```
@misc{yong2024lexcgen,
title={LexC-Gen: Generating Data for Extremely Low-Resource Languages with Large Language Models and Bilingual Lexicons},
author={Zheng-Xin Yong and Cristina Menghini and Stephen H. Bach},
year={2024},
eprint={2402.14086},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
result-kand2-sdxl-wuerst-karlo/9e7f6f37 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 152
num_examples: 10
download_size: 1303
dataset_size: 152
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "9e7f6f37"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ikorbiak/my_first_dataset | ---
license: apache-2.0
---
|
davanstrien/ia-loaded-embedded-gpu | ---
dataset_info:
features:
- name: crawl_date
dtype: int64
- name: last_modified_date
dtype: float64
- name: url
dtype: string
- name: filename
dtype: string
- name: extension
dtype: string
- name: mime_type_web_server
dtype: string
- name: mime_type_tika
dtype: string
- name: width
dtype: int64
- name: height
dtype: int64
- name: md5
dtype: string
- name: sha1
dtype: string
- name: image
dtype: image
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 1542994483.25
num_examples: 4582
download_size: 1515373521
dataset_size: 1542994483.25
---
# Dataset Card for "ia-loaded-embedded-gpu"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bunkalab/topic_based_chatml_dpo_pairs | ---
license: apache-2.0
language:
- en
---
# DPO Pairs
This is a preprocessed version of [mlabonne/chatml_dpo_pairs](https://huggingface.co/datasets/mlabonne/chatml_dpo_pairs) using [Bunkatopics](https://github.com/charlesdedampierre/BunkaTopics) to extract meaningful topics that help models converge with less data.
The objective was to create a dataset smaller than the original while keeping its efficiency. To achieve this, we compared the two datasets used to train the reward model in [mlabonne/chatml_dpo_pairs](https://huggingface.co/datasets/mlabonne/chatml_dpo_pairs): the rejected Llama answers and the accepted ChatGPT answers from the DPO dataset.
We then conducted topic modeling on both datasets, keeping only the topics that existed in the accepted dataset but not in the rejected one. Our hypothesis is that these topics encapsulate the main differences between the two answering styles.
This method allows for quicker convergence with significantly less data (around 1/6 of the initial dataset).
See the page of the resulting model [here](https://huggingface.co/charlesdedampierre/TopicNeuralHermes-2.5-Mistral-7B)
# Topic Analysis
We applied the topic modeling method to both datasets, extracting 30 topics from each. These topics were characterized using the 10 most specific unigrams or bigrams. We then compared the two sets of topics (30 from each dataset) and retained those in the accepted dataset that shared fewer than 2 terms with any topic in the rejected dataset.
We found the following 13 distinctive topics, each described by 10 terms:
**Emotional Dynamics**: feelings, Quinn, Austin, minority women, teaching, schools, individual, personality, backgrounds, triggers.
**Global Knowledge Queries**: question, information, geography, news articles, Step, answer, capital city, pipeline system, country, analogy.
**Digital Interactions and Queries**: questions, question, PersonX, modem, answers, effect relationship, Quora, browser, answer, e-commerce.
**Business and Cybersecurity**: email, businesses, initiatives, innovation, advertising papers, spam, breaches, antivirus, payments, prospects.
**Lifestyle and Wellness**: sleep, exercise, gifts, shopping, Casey, stores, stress, headaches, options, mood.
**Wildlife Ecology**: birds, prey, animals, species, infection, nest, eggs, bacteria, insects, kitty condo.
**Environmental Science and Climate**: temperature, gases, greenhouse, emissions, perturbation, sulfur, dioxide, climate change, water, heat.
**Maritime and Mechanical Engineering**: ship, bowling, propulsion, beam width, Filing cabinet, LED, lane, containment area, lawnmower, rotors.
**Cultural and Social Dynamics**: Lindsey, museum, Kate, Rachel, Jason, Alex, Erin, conversation, Laura, exhibits.
**Political Media Analysis**: media platforms, election, politics, teenagers, elections, White House, Barack Obama, nation, Confederate, depression.
**International Relations and Policy**: cooperation, EU, nations, alliance, NATO, European Union, member states, policy, monarch, Brexit.
**Astrophysics and Physical Sciences**: electrons, km, Moon, acceleration, orbit, friction, current, asteroid, electron, collector emitter.
**Film Critique and Analysis**: movie review, film, reviewer, sentiment, critic, flaws, DVD, plot, opinion, originality.
While these topics are not domain-specific, they did not appear among the topics extracted from the rejected dataset. Further research is needed to understand why these topics are prominent in the accepted dataset.
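The retention rule described above (keep an accepted topic only if it shares fewer than 2 terms with every rejected topic) can be sketched as a toy overlap filter; this is our own reimplementation, not the Bunkatopics code:

```python
def distinctive_topics(accepted, rejected, max_shared=1):
    """Keep accepted topics that share at most `max_shared` terms
    with every rejected topic. Topics are sets of descriptor terms."""
    return [
        topic for topic in accepted
        if all(len(topic & other) <= max_shared for other in rejected)
    ]

accepted = [{"sleep", "exercise", "stress"}, {"email", "spam", "antivirus"}]
rejected = [{"sleep", "exercise", "mood"}]
# The wellness topic shares two terms with a rejected topic and is dropped;
# only the cybersecurity topic remains.
distinctive_topics(accepted, rejected)  # → [{"email", "spam", "antivirus"}]
```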
# Load Dataset
```python
dataset = load_dataset("bunkalab/topic_based_chatml_dpo_pairs")['train']
``` |
shreyasharma/ret_sentence_eval | ---
dataset_info:
features:
- name: sentences
dtype: string
- name: label
dtype: int64
- name: docs
dtype: string
splits:
- name: test
num_bytes: 1973426
num_examples: 6104
- name: train
num_bytes: 3753896
num_examples: 11528
- name: val
num_bytes: 528670
num_examples: 1632
download_size: 2499468
dataset_size: 6255992
---
# Dataset Card for "ret_sentence_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fxmeng/mmlu_one_line | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: val
path: data/val-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: dev
num_bytes: 119303
num_examples: 285
- name: val
num_bytes: 730743
num_examples: 1531
- name: test
num_bytes: 6667440
num_examples: 14042
download_size: 3474542
dataset_size: 7517486
---
# Dataset Card for "mmlu_one_line"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mindcraft-nl/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: due_on
dtype: 'null'
- name: closed_at
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 16600004
num_examples: 3000
download_size: 4692861
dataset_size: 16600004
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/makigumo_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of makigumo/巻雲 (Kantai Collection)
This is the dataset of makigumo/巻雲 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `pink_hair, glasses, ahoge, twintails, long_hair, yellow_eyes, hair_bun, double_bun`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 540.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makigumo_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 318.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makigumo_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1134 | 655.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makigumo_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 483.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makigumo_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1134 | 903.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makigumo_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/makigumo_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
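For the IMG+TXT packages, each image ships with a sibling `.txt` tag file; pairing them can be sketched as follows (the file extensions are assumptions about the archive layout):

```python
from pathlib import Path

IMAGE_SUFFIXES = {".png", ".jpg", ".jpeg", ".webp"}

def iter_img_txt_pairs(dataset_dir):
    """Yield (image_path, tag_string) pairs from an extracted IMG+TXT package."""
    for path in sorted(Path(dataset_dir).iterdir()):
        if path.suffix.lower() in IMAGE_SUFFIXES:
            tags = path.with_suffix(".txt")
            if tags.exists():
                yield path, tags.read_text(encoding="utf-8").strip()
```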
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 31 |  |  |  |  |  | 1girl, school_uniform, solo, sleeves_past_fingers, skirt, pantyhose, looking_at_viewer, open_mouth, long_sleeves, boots, blush, smile |
| 1 | 17 |  |  |  |  |  | 1girl, long_sleeves, sleeves_past_fingers, solo, white_shirt, bowtie, halterneck, school_uniform, dress, grey_pantyhose, looking_at_viewer, simple_background, white_background, full_body, lace-up_boots, open_mouth, smile, cowboy_shot, standing |
| 2 | 14 |  |  |  |  |  | 1girl, blazer, long_sleeves, solo, grey_thighhighs, school_uniform, looking_at_viewer, halterneck, smile, purple_dress, blue-framed_eyewear, cowboy_shot, crown_braid, open_mouth, aqua_bowtie |
| 3 | 6 |  |  |  |  |  | 1girl, alternate_costume, blue_buruma, full_body, gym_shirt, gym_uniform, looking_at_viewer, short_sleeves, simple_background, solo, uwabaki, white_background, white_shirt, white_socks, from_behind, kneehighs, t-shirt, ass, looking_back, standing |
| 4 | 5 |  |  |  |  |  | 1girl, alternate_costume, blue-framed_eyewear, blue_buruma, cowboy_shot, gym_uniform, looking_at_viewer, short_sleeves, simple_background, solo, t-shirt, white_shirt, standing, white_background, gym_shirt, open_mouth, smile |
| 5 | 7 |  |  |  |  |  | 1girl, casual_one-piece_swimsuit, looking_at_viewer, simple_background, solo, tied_shirt, white_shirt, floral_print, full_body, grey_background, sandals, standing, swimsuit_under_clothes, t-shirt, white_background, barefoot |
| 6 | 6 |  |  |  |  |  | 1girl, casual_one-piece_swimsuit, looking_at_viewer, solo, tied_shirt, white_shirt, beachball, cowboy_shot, open_mouth, smile, simple_background, t-shirt, white_background |
| 7 | 19 |  |  |  |  |  | 1girl, competition_school_swimsuit, looking_at_viewer, alternate_costume, solo, blue_one-piece_swimsuit, simple_background, smile, white_background, open_mouth, cowboy_shot, sidelocks, blue-framed_eyewear, flat_chest |
| 8 | 11 |  |  |  |  |  | side-tie_bikini_bottom, 1girl, looking_at_viewer, solo, frilled_bikini, hair_flower, navel, floral_print, mismatched_bikini, open_mouth, cowboy_shot, blue-framed_eyewear, crown_braid, open_shirt, purple_bikini, sandals, simple_background, white_background |
| 9 | 5 |  |  |  |  |  | 1girl, alternate_costume, bowtie, detached_collar, fake_animal_ears, full_body, looking_at_viewer, playboy_bunny, rabbit_ears, simple_background, solo, strapless_leotard, wrist_cuffs, high_heels, open_mouth, rabbit_tail, white_background, breasts, grey_pantyhose, red_leotard, black_footwear, black_leotard, black_pantyhose, kneeling, smile |
| 10 | 8 |  |  |  |  |  | 1girl, enmaided, maid_bikini, maid_headdress, small_breasts, waist_apron, looking_at_viewer, solo, white_thighhighs, black_bikini, simple_background, wrist_cuffs, bowtie, detached_collar, black_footwear, mary_janes, bangs, blue-framed_eyewear, blush, frills, full_body, white_apron, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | school_uniform | solo | sleeves_past_fingers | skirt | pantyhose | looking_at_viewer | open_mouth | long_sleeves | boots | blush | smile | white_shirt | bowtie | halterneck | dress | grey_pantyhose | simple_background | white_background | full_body | lace-up_boots | cowboy_shot | standing | blazer | grey_thighhighs | purple_dress | blue-framed_eyewear | crown_braid | aqua_bowtie | alternate_costume | blue_buruma | gym_shirt | gym_uniform | short_sleeves | uwabaki | white_socks | from_behind | kneehighs | t-shirt | ass | looking_back | casual_one-piece_swimsuit | tied_shirt | floral_print | grey_background | sandals | swimsuit_under_clothes | barefoot | beachball | competition_school_swimsuit | blue_one-piece_swimsuit | sidelocks | flat_chest | side-tie_bikini_bottom | frilled_bikini | hair_flower | navel | mismatched_bikini | open_shirt | purple_bikini | detached_collar | fake_animal_ears | playboy_bunny | rabbit_ears | strapless_leotard | wrist_cuffs | high_heels | rabbit_tail | breasts | red_leotard | black_footwear | black_leotard | black_pantyhose | kneeling | enmaided | maid_bikini | maid_headdress | small_breasts | waist_apron | white_thighhighs | black_bikini | mary_janes | bangs | frills | white_apron |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------------|:-------|:-----------------------|:--------|:------------|:--------------------|:-------------|:---------------|:--------|:--------|:--------|:--------------|:---------|:-------------|:--------|:-----------------|:--------------------|:-------------------|:------------|:----------------|:--------------|:-----------|:---------|:------------------|:---------------|:----------------------|:--------------|:--------------|:--------------------|:--------------|:------------|:--------------|:----------------|:----------|:--------------|:--------------|:------------|:----------|:------|:---------------|:----------------------------|:-------------|:---------------|:------------------|:----------|:-------------------------|:-----------|:------------|:------------------------------|:--------------------------|:------------|:-------------|:-------------------------|:-----------------|:--------------|:--------|:--------------------|:-------------|:----------------|:------------------|:-------------------|:----------------|:--------------|:--------------------|:--------------|:-------------|:--------------|:----------|:--------------|:-----------------|:----------------|:------------------|:-----------|:-----------|:--------------|:-----------------|:----------------|:--------------|:-------------------|:---------------|:-------------|:--------|:---------|:--------------|
| 0 | 31 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 17 |  |  |  |  |  | X | X | X | X | | | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | X | | | | X | X | X | | | X | | | X | | | | | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | | | | X | | | | | | X | | | | | X | X | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | | | | X | X | | | | X | X | | | | | X | X | | | X | X | | | | X | | | X | X | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | | X | | | | X | | | | | | X | | | | | X | X | X | | | X | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | X | | | | X | X | | | | X | X | | | | | X | X | | | X | | | | | | | | | | | | | | | | | X | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 19 |  |  |  |  |  | X | | X | | | | X | X | | | | X | | | | | | X | X | | | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 11 |  |  |  |  |  | X | | X | | | | X | X | | | | | | | | | | X | X | | | X | | | | | X | X | | | | | | | | | | | | | | | | X | | X | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | X | | | | X | X | | | | X | | X | | | X | X | X | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 10 | 8 |  |  |  |  |  | X | | X | | | | X | | | | X | | | X | | | | X | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X |
|
liuyanchen1015/MULTI_VALUE_mnli_reflex_number | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 20356
num_examples: 92
- name: dev_mismatched
num_bytes: 27643
num_examples: 112
- name: test_matched
num_bytes: 22153
num_examples: 88
- name: test_mismatched
num_bytes: 23549
num_examples: 102
- name: train
num_bytes: 1047688
num_examples: 4272
download_size: 636372
dataset_size: 1141389
---
# Dataset Card for "MULTI_VALUE_mnli_reflex_number"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TheBloke__OpenOrcaxOpenChat-Preview2-13B-GPTQ | ---
pretty_name: Evaluation run of TheBloke/OpenOrcaxOpenChat-Preview2-13B-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/OpenOrcaxOpenChat-Preview2-13B-GPTQ](https://huggingface.co/TheBloke/OpenOrcaxOpenChat-Preview2-13B-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__OpenOrcaxOpenChat-Preview2-13B-GPTQ\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T10:53:17.967443](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__OpenOrcaxOpenChat-Preview2-13B-GPTQ/blob/main/results_2023-10-22T10-53-17.967443.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004823825503355705,\n\
\ \"em_stderr\": 0.0007095539645563046,\n \"f1\": 0.08351929530201369,\n\
\ \"f1_stderr\": 0.0017605531187545353,\n \"acc\": 0.4477247418430049,\n\
\ \"acc_stderr\": 0.010448120593026917\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.004823825503355705,\n \"em_stderr\": 0.0007095539645563046,\n\
\ \"f1\": 0.08351929530201369,\n \"f1_stderr\": 0.0017605531187545353\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1243366186504928,\n \
\ \"acc_stderr\": 0.009088880962028442\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025395\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/OpenOrcaxOpenChat-Preview2-13B-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|arc:challenge|25_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T10_53_17.967443
path:
- '**/details_harness|drop|3_2023-10-22T10-53-17.967443.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T10-53-17.967443.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T10_53_17.967443
path:
- '**/details_harness|gsm8k|5_2023-10-22T10-53-17.967443.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T10-53-17.967443.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hellaswag|10_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T10_53_17.967443
path:
- '**/details_harness|winogrande|5_2023-10-22T10-53-17.967443.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T10-53-17.967443.parquet'
- config_name: results
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- results_2023-08-30T22:06:48.097340.parquet
- split: 2023_10_22T10_53_17.967443
path:
- results_2023-10-22T10-53-17.967443.parquet
- split: latest
path:
- results_2023-10-22T10-53-17.967443.parquet
---
# Dataset Card for Evaluation run of TheBloke/OpenOrcaxOpenChat-Preview2-13B-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/OpenOrcaxOpenChat-Preview2-13B-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/OpenOrcaxOpenChat-Preview2-13B-GPTQ](https://huggingface.co/TheBloke/OpenOrcaxOpenChat-Preview2-13B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__OpenOrcaxOpenChat-Preview2-13B-GPTQ",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T10:53:17.967443](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__OpenOrcaxOpenChat-Preview2-13B-GPTQ/blob/main/results_2023-10-22T10-53-17.967443.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004823825503355705,
"em_stderr": 0.0007095539645563046,
"f1": 0.08351929530201369,
"f1_stderr": 0.0017605531187545353,
"acc": 0.4477247418430049,
"acc_stderr": 0.010448120593026917
},
"harness|drop|3": {
"em": 0.004823825503355705,
"em_stderr": 0.0007095539645563046,
"f1": 0.08351929530201369,
"f1_stderr": 0.0017605531187545353
},
"harness|gsm8k|5": {
"acc": 0.1243366186504928,
"acc_stderr": 0.009088880962028442
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.011807360224025395
}
}
```
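Because the aggregated results shown above are plain JSON files in the repository, they can also be inspected with nothing but the standard library. A minimal sketch, with the metric values copied verbatim from the snippet above (the percentage reporting is just an illustration, not part of the evaluation pipeline):

```python
import json

# Aggregated metrics copied from the latest results above.
latest = json.loads("""
{
  "all": {"acc": 0.4477247418430049, "acc_stderr": 0.010448120593026917},
  "harness|winogrande|5": {"acc": 0.771112865035517}
}
""")

# e.g. report the winogrande accuracy as a rounded percentage
winogrande_pct = round(latest["harness|winogrande|5"]["acc"] * 100, 1)
print(winogrande_pct)  # → 77.1
```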
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Aeala/ShareGPT_Vicuna_unfiltered | ---
license: apache-2.0
language:
- en
---
## Dataset Card
This is a reupload of [this dataset](https://huggingface.co/datasets/anon8231489123/ShareGPT_Vicuna_unfiltered) that was further cleaned by gozfarb. |
blastwind/github-code-haskell-file | ---
dataset_info:
features:
- name: code
dtype: string
- name: repo_name
dtype: string
- name: path
dtype: string
- name: license
dtype: string
- name: size
dtype: int64
- name: n_ast_errors
dtype: int64
- name: ast_max_depth
dtype: int64
- name: n_whitespaces
dtype: int64
- name: n_ast_nodes
dtype: int64
- name: n_ast_terminals
dtype: int64
- name: n_ast_nonterminals
dtype: int64
- name: loc
dtype: int64
- name: cycloplexity
dtype: int64
splits:
- name: train
num_bytes: 2024779946
num_examples: 339895
download_size: 805998536
dataset_size: 2024779946
task_categories:
- text-generation
tags:
- code
- haskell
size_categories:
- 100K<n<1M
---
# Dataset Card for "github-code-haskell-file"
Rows: 339k
Download Size: 806M
This dataset is extracted from [github-code-clean](https://huggingface.co/datasets/codeparrot/github-code-clean).
Each row also contains attribute values for my personal analysis project.
12.6% (43k) of the rows have cyclomatic complexity and LOC valued at `-1` because [`homplexity`](https://github.com/BlastWind/homplexity) failed to parse the row's `uncommented_code`. |
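Per the card above, homplexity failures are marked with a `-1` sentinel in the `loc` and `cycloplexity` columns, so they can be dropped per-row. A minimal sketch of such a filter (the helper name and sample rows are illustrative, not part of the dataset):

```python
def parsed_ok(row):
    # Rows where homplexity failed carry -1 in both metric columns.
    return row["cycloplexity"] != -1 and row["loc"] != -1

# With the `datasets` library this would be applied as ds.filter(parsed_ok)
# after load_dataset("blastwind/github-code-haskell-file"); shown here on
# hand-written sample rows so the logic is self-contained.
sample = [
    {"loc": 42, "cycloplexity": 3},   # parsed fine
    {"loc": -1, "cycloplexity": -1},  # homplexity parse failure
]
kept = [r for r in sample if parsed_ok(r)]
print(len(kept))  # 1
```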
joey234/mmlu-high_school_european_history-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 315227
num_examples: 165
download_size: 168253
dataset_size: 315227
---
# Dataset Card for "mmlu-high_school_european_history-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
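In the schema above, `answer` is a `class_label` whose names are declared in the YAML (`'0': A` through `'3': D`). A minimal sketch of mapping the integer label back to its letter (the helper and sample row are illustrative):

```python
# Label names as declared in the card's YAML above.
LABEL_NAMES = ["A", "B", "C", "D"]

def answer_letter(example):
    # Map the integer class_label back to its letter name.
    return LABEL_NAMES[example["answer"]]

# Hypothetical row in the shape of the test split's schema.
row = {"question": "…", "choices": ["w", "x", "y", "z"], "answer": 2, "neg_prompt": "…"}
print(answer_letter(row))  # C
```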
showchen/jiaobaoyu | ---
license: apache-2.0
---
|
medric49/dolly-rag-pythia-410m | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: res:airedefined/pythia-410m-dolly-rag
dtype: string
splits:
- name: train
num_bytes: 6006042
num_examples: 3588
download_size: 3742391
dataset_size: 6006042
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dolly-rag-pythia-410m"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
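Given the schema above (a human-written `response` plus a generation column named after the model), one plausible preprocessing step — purely an assumption about how the columns might be paired, not something the card states — is to build preference pairs:

```python
# Column name taken verbatim from the YAML schema above.
GEN_COL = "res:airedefined/pythia-410m-dolly-rag"

def to_preference_pair(row):
    # Assumption: the reference response is "chosen", the model generation "rejected".
    return {
        "prompt": row["instruction"],
        "chosen": row["response"],
        "rejected": row[GEN_COL],
    }

# Hypothetical row in the shape of the train split's schema.
row = {"instruction": "Q?", "context": "", "response": "ref", GEN_COL: "gen", "category": "qa"}
pair = to_preference_pair(row)
print(pair["rejected"])  # gen
```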
gsstein/75-baseline-dataset-llama | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: prompt
dtype: string
- name: generated
dtype: bool
- name: raw_summary
dtype: string
splits:
- name: train
num_bytes: 129433124
num_examples: 15326
- name: test
num_bytes: 4635338
num_examples: 576
- name: validation
num_bytes: 4920271
num_examples: 576
download_size: 85078773
dataset_size: 138988733
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
beyond-repair/dataset_smart_contract | ---
license: mit
---
|
open-llm-leaderboard/details_NExtNewChattingAI__shark_tank_ai_7_b | ---
pretty_name: Evaluation run of NExtNewChattingAI/shark_tank_ai_7_b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NExtNewChattingAI/shark_tank_ai_7_b](https://huggingface.co/NExtNewChattingAI/shark_tank_ai_7_b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NExtNewChattingAI__shark_tank_ai_7_b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-18T08:22:45.276136](https://huggingface.co/datasets/open-llm-leaderboard/details_NExtNewChattingAI__shark_tank_ai_7_b/blob/main/results_2023-12-18T08-22-45.276136.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.65556501715169,\n\
\ \"acc_stderr\": 0.031845412531981275,\n \"acc_norm\": 0.6565753714833046,\n\
\ \"acc_norm_stderr\": 0.0324927894891608,\n \"mc1\": 0.4259485924112607,\n\
\ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.6019434535176705,\n\
\ \"mc2_stderr\": 0.015061482204205485\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042194,\n\
\ \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.013752062419817836\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.678550089623581,\n\
\ \"acc_stderr\": 0.004660785616933751,\n \"acc_norm\": 0.8660625373431587,\n\
\ \"acc_norm_stderr\": 0.0033988905252297008\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726367,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726367\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4417989417989418,\n \"acc_stderr\": 0.025576257061253833,\n \"\
acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.025576257061253833\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586825,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586825\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.02340092891831049,\n \
\ \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.02340092891831049\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.029213549414372177,\n\
\ \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.029213549414372177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.032785485373431386,\n\
\ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.032785485373431386\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098825,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098825\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8390804597701149,\n\
\ \"acc_stderr\": 0.013140225515611724,\n \"acc_norm\": 0.8390804597701149,\n\
\ \"acc_norm_stderr\": 0.013140225515611724\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500107,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500107\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n\
\ \"acc_stderr\": 0.01662399851333311,\n \"acc_norm\": 0.44581005586592176,\n\
\ \"acc_norm_stderr\": 0.01662399851333311\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n\
\ \"acc_stderr\": 0.01272570165695364,\n \"acc_norm\": 0.45827900912646674,\n\
\ \"acc_norm_stderr\": 0.01272570165695364\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.027778298701545443,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.027778298701545443\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6944444444444444,\n \"acc_stderr\": 0.01863559403442397,\n \
\ \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.01863559403442397\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.02411267824090083,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.02411267824090083\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4259485924112607,\n\
\ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.6019434535176705,\n\
\ \"mc2_stderr\": 0.015061482204205485\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613983\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6573161485974223,\n \
\ \"acc_stderr\": 0.01307303023082791\n }\n}\n```"
repo_url: https://huggingface.co/NExtNewChattingAI/shark_tank_ai_7_b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|arc:challenge|25_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|gsm8k|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hellaswag|10_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T08-22-45.276136.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T08-22-45.276136.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- '**/details_harness|winogrande|5_2023-12-18T08-22-45.276136.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-18T08-22-45.276136.parquet'
- config_name: results
data_files:
- split: 2023_12_18T08_22_45.276136
path:
- results_2023-12-18T08-22-45.276136.parquet
- split: latest
path:
- results_2023-12-18T08-22-45.276136.parquet
---
# Dataset Card for Evaluation run of NExtNewChattingAI/shark_tank_ai_7_b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NExtNewChattingAI/shark_tank_ai_7_b](https://huggingface.co/NExtNewChattingAI/shark_tank_ai_7_b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NExtNewChattingAI__shark_tank_ai_7_b",
"harness_winogrande_5",
split="train")
```
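Individual runs can also be loaded by their timestamped split name instead of `"train"`. As a small sketch (the split name below is taken from the config listing above), such a name can be parsed back into a `datetime`, since the timestamp is encoded with `_` and `-` substitutions to keep the split name valid:

```python
from datetime import datetime

# Split names encode the run timestamp, e.g. "2023_12_18T08_22_45.276136":
# "_" replaces "-" in the date part and ":" in the time part.
split_name = "2023_12_18T08_22_45.276136"
date_part, time_part = split_name.split("T")

# Undo the substitutions and parse into a datetime.
timestamp = datetime.strptime(
    f"{date_part.replace('_', '-')}T{time_part.replace('_', ':')}",
    "%Y-%m-%dT%H:%M:%S.%f",
)
print(timestamp.isoformat())  # → 2023-12-18T08:22:45.276136
```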
## Latest results
These are the [latest results from run 2023-12-18T08:22:45.276136](https://huggingface.co/datasets/open-llm-leaderboard/details_NExtNewChattingAI__shark_tank_ai_7_b/blob/main/results_2023-12-18T08-22-45.276136.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.65556501715169,
"acc_stderr": 0.031845412531981275,
"acc_norm": 0.6565753714833046,
"acc_norm_stderr": 0.0324927894891608,
"mc1": 0.4259485924112607,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.6019434535176705,
"mc2_stderr": 0.015061482204205485
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.014097810678042194,
"acc_norm": 0.6689419795221843,
"acc_norm_stderr": 0.013752062419817836
},
"harness|hellaswag|10": {
"acc": 0.678550089623581,
"acc_stderr": 0.004660785616933751,
"acc_norm": 0.8660625373431587,
"acc_norm_stderr": 0.0033988905252297008
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726367,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726367
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4417989417989418,
"acc_stderr": 0.025576257061253833,
"acc_norm": 0.4417989417989418,
"acc_norm_stderr": 0.025576257061253833
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586825,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586825
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.02340092891831049,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.02340092891831049
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.029213549414372177,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.029213549414372177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.032785485373431386,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.032785485373431386
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098825,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8390804597701149,
"acc_stderr": 0.013140225515611724,
"acc_norm": 0.8390804597701149,
"acc_norm_stderr": 0.013140225515611724
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500107,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500107
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.01662399851333311,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.01662399851333311
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45827900912646674,
"acc_stderr": 0.01272570165695364,
"acc_norm": 0.45827900912646674,
"acc_norm_stderr": 0.01272570165695364
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.027778298701545443,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.027778298701545443
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.01863559403442397,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.01863559403442397
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.02411267824090083,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.02411267824090083
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4259485924112607,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.6019434535176705,
"mc2_stderr": 0.015061482204205485
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.010814911009613983
},
"harness|gsm8k|5": {
"acc": 0.6573161485974223,
"acc_stderr": 0.01307303023082791
}
}
```
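As an illustration, the per-task accuracies in such a results dictionary can be collected and ranked programmatically. This is a minimal sketch using a small subset of the values above (not the full results):

```python
# Small subset of the results dict shown above, for illustration only.
results = {
    "harness|arc:challenge|25": {"acc": 0.6313993174061433, "acc_norm": 0.6689419795221843},
    "harness|hellaswag|10": {"acc": 0.678550089623581, "acc_norm": 0.8660625373431587},
    "harness|winogrande|5": {"acc": 0.819258089976322},
}

# Task keys have the form "harness|<task>|<n_fewshot>"; keep the middle
# component as a readable task name, and only tasks that report "acc".
accuracies = {
    key.split("|")[1]: scores["acc"]
    for key, scores in results.items()
    if "acc" in scores
}

# Rank tasks from highest to lowest accuracy.
ranked = sorted(accuracies.items(), key=lambda kv: kv[1], reverse=True)
for task, acc in ranked:
    print(f"{task}: {acc:.4f}")
```

The same idea applies to `acc_norm`, `mc1`/`mc2`, or the stderr fields, by swapping the metric name in the comprehension.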
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
rocailler/test_prod_lab | ---
license: cc
dataset_info:
features:
- name: product_name
dtype: string
- name: labels
dtype: string
splits:
- name: train
num_bytes: 19643
num_examples: 99
download_size: 0
dataset_size: 19643
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tj-solergibert/Europarl-ST-processed-mt-fr | ---
dataset_info:
features:
- name: source_text
dtype: string
- name: dest_text
dtype: string
- name: dest_lang
dtype:
class_label:
names:
'0': de
'1': en
'2': es
'3': fr
'4': it
'5': nl
'6': pl
'7': pt
'8': ro
splits:
- name: train
num_bytes: 199700180
num_examples: 560866
- name: valid
num_bytes: 27382683
num_examples: 74712
- name: test
num_bytes: 28363822
num_examples: 77906
download_size: 95095990
dataset_size: 255446685
---
# Dataset Card for "Europarl-ST-processed-mt-fr"
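The `dest_lang` feature is stored as an integer class label. A minimal sketch of decoding it back to a language code, using a plain list that mirrors the `names` mapping above (no `datasets` dependency assumed):

```python
# Language codes in the same order as the class_label `names` mapping above.
DEST_LANG_NAMES = ["de", "en", "es", "fr", "it", "nl", "pl", "pt", "ro"]

def decode_dest_lang(label: int) -> str:
    """Map an integer `dest_lang` label to its language code."""
    return DEST_LANG_NAMES[label]

def encode_dest_lang(code: str) -> int:
    """Map a language code back to its integer label."""
    return DEST_LANG_NAMES.index(code)
```

With the `datasets` library installed, the same mapping is available directly from the loaded dataset via `dataset.features["dest_lang"].int2str(label)`.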
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TheFinAI/fiqa-sentiment-classification | ---
language: en
license: mit
dataset_info:
features:
- name: _id
dtype: string
- name: sentence
dtype: string
- name: target
dtype: string
- name: aspect
dtype: string
- name: score
dtype: float64
- name: type
dtype: string
splits:
- name: train
num_bytes: 119567
num_examples: 822
- name: valid
num_bytes: 17184
num_examples: 117
- name: test
num_bytes: 33728
num_examples: 234
download_size: 102225
dataset_size: 170479
---
# FiQA Sentiment Classification
## Dataset Description
This dataset is based on Task 1 of the Financial Sentiment Analysis in the Wild (FiQA) challenge. It follows the same settings as described in the paper 'A Baseline for Aspect-Based Sentiment Analysis in Financial Microblogs and News'. The dataset is split into three subsets: train, valid, and test, with 822, 117, and 234 examples respectively.
## Dataset Structure
- `_id`: ID of the data point
- `sentence`: The sentence
- `target`: The target of the sentiment
- `aspect`: The aspect of the sentiment
- `score`: The sentiment score
- `type`: The type of the data point (headline or post)
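The `score` field is a continuous sentiment score. A common way to use it for classification is to bucket it with thresholds; a minimal sketch (the ±0.1 threshold and the example row are illustrative assumptions, not part of the dataset):

```python
def score_to_label(score: float, threshold: float = 0.1) -> str:
    """Bucket a continuous sentiment score into a discrete label.

    The 0.1 threshold is an illustrative choice; tune it for your task.
    """
    if score > threshold:
        return "positive"
    if score < -threshold:
        return "negative"
    return "neutral"

# Example row shaped like a record from this dataset (values are made up).
row = {"sentence": "Shares rallied after earnings.", "score": 0.42, "type": "headline"}
label = score_to_label(row["score"])
```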
## Additional Information
- Homepage: [FiQA Challenge](https://sites.google.com/view/fiqa/home)
- Citation: [A Baseline for Aspect-Based Sentiment Analysis in Financial Microblogs and News](https://arxiv.org/pdf/2211.00083.pdf)
## Downloading CSV
```python
from datasets import load_dataset
# Load the dataset from the hub
dataset = load_dataset("TheFinAI/fiqa-sentiment-classification")
# Save the dataset to a CSV file
dataset["train"].to_csv("train.csv")
dataset["valid"].to_csv("valid.csv")
dataset["test"].to_csv("test.csv")
```
|
alanila/autotrain-data-mm | ---
task_categories:
- conditional-text-generation
---
# AutoTrain Dataset for project: mm
## Dataset Description
This dataset has been automatically processed by AutoTrain for project mm.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "Email from attorney A Dutkanych regarding executed Settlement Agreement",
"target": "Email from attorney A Dutkanych regarding executed Settlement Agreement"
},
{
"text": "Telephone conference with A Royer regarding additional factual background information relating to O Stapletons Charge of Discrimination allegations",
"target": "Telephone conference with A Royer regarding additional factual background information as to O Stapletons Charge of Discrimination allegations"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 88 |
| valid | 22 |
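Since `text` and `target` are near-identical in many rows, a quick sanity check is to count how many pairs actually differ. A minimal sketch over sample records shaped like the JSON above (the sample values are illustrative):

```python
def pairs_differing(records):
    """Count records whose `target` differs from `text`."""
    return sum(1 for r in records if r["text"] != r["target"])

# Two records shaped like this dataset's rows: one unchanged, one paraphrased.
samples = [
    {"text": "same sentence", "target": "same sentence"},
    {"text": "conference regarding facts", "target": "conference as to facts"},
]
changed = pairs_differing(samples)
```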
|
open-llm-leaderboard/details_kevin009__babyllama-v0.6 | ---
pretty_name: Evaluation run of kevin009/babyllama-v0.6
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kevin009/babyllama-v0.6](https://huggingface.co/kevin009/babyllama-v0.6) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kevin009__babyllama-v0.6\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-13T10:06:30.565512](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__babyllama-v0.6/blob/main/results_2024-02-13T10-06-30.565512.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26066347798834766,\n\
\ \"acc_stderr\": 0.030904794820091792,\n \"acc_norm\": 0.26161932329960463,\n\
\ \"acc_norm_stderr\": 0.0316608342460649,\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.014509045171487295,\n \"mc2\": 0.3584100057903431,\n\
\ \"mc2_stderr\": 0.013776314892170112\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35238907849829354,\n \"acc_stderr\": 0.013960142600598677,\n\
\ \"acc_norm\": 0.3609215017064846,\n \"acc_norm_stderr\": 0.014034761386175458\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.46335391356303524,\n\
\ \"acc_stderr\": 0.004976361454341339,\n \"acc_norm\": 0.6159131647082254,\n\
\ \"acc_norm_stderr\": 0.0048538457503921415\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17777777777777778,\n\
\ \"acc_stderr\": 0.03302789859901717,\n \"acc_norm\": 0.17777777777777778,\n\
\ \"acc_norm_stderr\": 0.03302789859901717\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677077,\n\
\ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677077\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.027724236492700904,\n\
\ \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.027724236492700904\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n\
\ \"acc_stderr\": 0.030299574664788147,\n \"acc_norm\": 0.19653179190751446,\n\
\ \"acc_norm_stderr\": 0.030299574664788147\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.02937917046412482,\n\
\ \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.02937917046412482\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n\
\ \"acc_stderr\": 0.025091892378859275,\n \"acc_norm\": 0.2645161290322581,\n\
\ \"acc_norm_stderr\": 0.025091892378859275\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03010833071801162,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03010833071801162\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24615384615384617,\n \"acc_stderr\": 0.021840866990423088,\n\
\ \"acc_norm\": 0.24615384615384617,\n \"acc_norm_stderr\": 0.021840866990423088\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507384,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507384\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.027553614467863818,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.027553614467863818\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473836,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473836\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"\
acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25316455696202533,\n \"acc_stderr\": 0.0283046579430353,\n \
\ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.0283046579430353\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.3632286995515695,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969195,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969195\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n\
\ \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.28205128205128205,\n\
\ \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.280970625798212,\n\
\ \"acc_stderr\": 0.01607312785122125,\n \"acc_norm\": 0.280970625798212,\n\
\ \"acc_norm_stderr\": 0.01607312785122125\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.022497230190967547,\n\
\ \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.022497230190967547\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22681564245810057,\n\
\ \"acc_stderr\": 0.014005843570897897,\n \"acc_norm\": 0.22681564245810057,\n\
\ \"acc_norm_stderr\": 0.014005843570897897\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n\
\ \"acc_stderr\": 0.02512263760881665,\n \"acc_norm\": 0.26688102893890675,\n\
\ \"acc_norm_stderr\": 0.02512263760881665\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23859191655801826,\n\
\ \"acc_stderr\": 0.0108859297420022,\n \"acc_norm\": 0.23859191655801826,\n\
\ \"acc_norm_stderr\": 0.0108859297420022\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26633986928104575,\n \"acc_stderr\": 0.017883188134667192,\n \
\ \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.017883188134667192\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n\
\ \"acc_stderr\": 0.04461272175910507,\n \"acc_norm\": 0.3181818181818182,\n\
\ \"acc_norm_stderr\": 0.04461272175910507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.02412746346265015,\n\
\ \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.02412746346265015\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.03629335329947861,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.03629335329947861\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.014509045171487295,\n \"mc2\": 0.3584100057903431,\n\
\ \"mc2_stderr\": 0.013776314892170112\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6101026045777427,\n \"acc_stderr\": 0.013707547317008463\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \
\ \"acc_stderr\": 0.0034478192723890015\n }\n}\n```"
repo_url: https://huggingface.co/kevin009/babyllama-v0.6
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|arc:challenge|25_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|gsm8k|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hellaswag|10_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T10-06-30.565512.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T10-06-30.565512.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- '**/details_harness|winogrande|5_2024-02-13T10-06-30.565512.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-13T10-06-30.565512.parquet'
- config_name: results
data_files:
- split: 2024_02_13T10_06_30.565512
path:
- results_2024-02-13T10-06-30.565512.parquet
- split: latest
path:
- results_2024-02-13T10-06-30.565512.parquet
---
# Dataset Card for Evaluation run of abhishek/hepu-o4zf-ravz-7-0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhishek/hepu-o4zf-ravz-7-0](https://huggingface.co/abhishek/hepu-o4zf-ravz-7-0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhishek__hepu-o4zf-ravz-7-0",
"harness_winogrande_5",
split="train")
```
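As a note on the split naming described above: judging from the split names in the configs, a run timestamp maps to its split name by replacing hyphens and colons with underscores. A minimal sketch, using the timestamp from this run:

```python
# Sketch: derive the split name used in this dataset's configs from a run
# timestamp. Hyphens and colons become underscores; the fractional seconds
# are kept as-is. The timestamp below is the one from this run.
timestamp = "2024-02-13T10:06:30.565512"
split_name = timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2024_02_13T10_06_30.565512
```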
## Latest results
These are the [latest results from run 2024-02-13T10:06:30.565512](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishek__hepu-o4zf-ravz-7-0/blob/main/results_2024-02-13T10-06-30.565512.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26066347798834766,
"acc_stderr": 0.030904794820091792,
"acc_norm": 0.26161932329960463,
"acc_norm_stderr": 0.0316608342460649,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487295,
"mc2": 0.3584100057903431,
"mc2_stderr": 0.013776314892170112
},
"harness|arc:challenge|25": {
"acc": 0.35238907849829354,
"acc_stderr": 0.013960142600598677,
"acc_norm": 0.3609215017064846,
"acc_norm_stderr": 0.014034761386175458
},
"harness|hellaswag|10": {
"acc": 0.46335391356303524,
"acc_stderr": 0.004976361454341339,
"acc_norm": 0.6159131647082254,
"acc_norm_stderr": 0.0048538457503921415
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17777777777777778,
"acc_stderr": 0.03302789859901717,
"acc_norm": 0.17777777777777778,
"acc_norm_stderr": 0.03302789859901717
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677077,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677077
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.027724236492700904,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.027724236492700904
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.030299574664788147,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.030299574664788147
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.02937917046412482,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.02937917046412482
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276864,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276864
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.025091892378859275,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.025091892378859275
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03010833071801162,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03010833071801162
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24615384615384617,
"acc_stderr": 0.021840866990423088,
"acc_norm": 0.24615384615384617,
"acc_norm_stderr": 0.021840866990423088
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507384,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507384
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.027553614467863818,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.027553614467863818
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473836,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473836
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24036697247706423,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.24036697247706423,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969195,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969195
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.280970625798212,
"acc_stderr": 0.01607312785122125,
"acc_norm": 0.280970625798212,
"acc_norm_stderr": 0.01607312785122125
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.022497230190967547,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.022497230190967547
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22681564245810057,
"acc_stderr": 0.014005843570897897,
"acc_norm": 0.22681564245810057,
"acc_norm_stderr": 0.014005843570897897
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26688102893890675,
"acc_stderr": 0.02512263760881665,
"acc_norm": 0.26688102893890675,
"acc_norm_stderr": 0.02512263760881665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23859191655801826,
"acc_stderr": 0.0108859297420022,
"acc_norm": 0.23859191655801826,
"acc_norm_stderr": 0.0108859297420022
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.017883188134667192,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.017883188134667192
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910507,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17142857142857143,
"acc_stderr": 0.02412746346265015,
"acc_norm": 0.17142857142857143,
"acc_norm_stderr": 0.02412746346265015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.03629335329947861,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.03629335329947861
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487295,
"mc2": 0.3584100057903431,
"mc2_stderr": 0.013776314892170112
},
"harness|winogrande|5": {
"acc": 0.6101026045777427,
"acc_stderr": 0.013707547317008463
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.0034478192723890015
}
}
```
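To work with the per-task scores above, you can filter the result keys by benchmark prefix. A minimal sketch, using a small illustrative subset of the dict shown above (the full values live in the "results" config):

```python
# Sketch: average the per-task MMLU ("hendrycksTest") acc_norm scores from a
# results dict shaped like the JSON above. The dict here is a small
# illustrative subset; real values come from this dataset's "results" config.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.24},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.17777777777777778},
    "harness|winogrande|5": {"acc": 0.6101026045777427},
}

mmlu_scores = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"{len(mmlu_scores)} MMLU tasks, mean acc_norm = {mmlu_avg:.4f}")
```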
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
chan127ck/temp-dataset | ---
license: mit
---
|
dkshjn/MedpromptCoT | ---
dataset_info:
features:
- name: question
dtype: string
- name: options
dtype: string
- name: context
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 33862
num_examples: 100
download_size: 0
dataset_size: 33862
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "MedpromptCoT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudupstack-tex | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 186934
num_examples: 2906
- name: corpus
num_bytes: 86600423
num_examples: 68184
download_size: 43424126
dataset_size: 86787357
---
# Dataset Card for "cqudupstack-tex"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
quan246/MultiMed_News | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: translation
struct:
- name: en
dtype: string
- name: vi
dtype: string
splits:
- name: train
num_bytes: 189631
num_examples: 1000
- name: dev
num_bytes: 22160
num_examples: 100
- name: test
num_bytes: 628165
num_examples: 2352
download_size: 483746
dataset_size: 839956
---
# Dataset Card for "MultiMed_News"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tejagoud/sample2testforqa | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 3143730.0
num_examples: 7
download_size: 614598
dataset_size: 3143730.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
atgarcia/trainDataset3 | ---
dataset_info:
features:
- name: text
dtype: string
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: emg
sequence:
sequence: float64
splits:
- name: train
num_bytes: 809926972
num_examples: 548
download_size: 306448547
dataset_size: 809926972
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hdeldar/Persian-Text-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1830325
num_examples: 1000
download_size: 1841325
dataset_size: 1830325
dataset_name: json
configs:
- config_name: default
data_files:
- split: train
path: data/data-*
---
# Persian-Text-QA: Lazy Llama 2 Formatting
This is a subset (1k samples) of the [`SeyedAli/Persian-Text-QA`](https://huggingface.co/datasets/SeyedAli/Persian-Text-QA) dataset, processed to match Llama 2's prompt format as described [in this article](https://huggingface.co/blog/llama2#how-to-prompt-llama-2). It was created using the following [colab notebook](https://colab.research.google.com/drive/1Ad7a9zMmkxuXTOh1Z7-rNSICA4dybpM2?usp=sharing).
Useful if you don't want to reformat it yourself (e.g., with a script). It was designed for [this article](https://mlabonne.github.io/blog/posts/Fine_Tune_Your_Own_Llama_2_Model_in_a_Colab_Notebook.html) about fine-tuning a Llama 2 (chat) model in a Google Colab.
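As a rough sketch only (not the exact processing script used for this dataset), wrapping a question/answer pair in the single-turn Llama 2 chat template described in the linked article looks like this; the function name is illustrative:

```python
def to_llama2_prompt(question: str, answer: str) -> str:
    """Wrap a QA pair in the single-turn Llama 2 chat template.

    The <s>, </s>, and [INST] tags follow the Llama 2 prompt format;
    this helper is a sketch, not the dataset's actual formatting code.
    """
    return f"<s>[INST] {question.strip()} [/INST] {answer.strip()} </s>"

# Example with a placeholder Persian QA pair:
sample = to_llama2_prompt("سوال نمونه", "پاسخ نمونه")
```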
|
gaygaaa/LINKS | ---
license: mit
---
|
Rasi1610/Deathce502merged_series2_3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 172807217.0
num_examples: 317
- name: val
num_bytes: 43245725.0
num_examples: 80
download_size: 215970660
dataset_size: 216052942.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
---
|
cmu-lti/sotopia-pi | ---
license: cc-by-sa-4.0
---
For details on how the dataset is used, please refer to https://arxiv.org/pdf/2403.08715.
## inspirational_prompt.csv
This csv stores the full inspirational prompts from three data sources: `social_iqa`, `social_chem`, and `normbank`. The prompts are used to generate the social tasks used in Sotopia-pi. Each inspirational prompt generates an "Environment" object in Sotopia-pi that specifies the background of the social task, and each environment is combined with different agent profiles and relationships to generate a comprehensive set of social tasks.
Compared to Sotopia's inspirational prompts, which cherry-pick a few examples from six datasets (`social_iqa`, `social_chem`, `normbank`, `deal-or-no-deal`, `persuation_for_good`, `mindcraft`), we do not include `deal-or-no-deal` and `mindcraft` because their inspirational prompts are too similar within each dataset and could cause leakage if we train on them and test on the Sotopia ones. We also exclude `persuation_for_good` because we cannot find inspirational prompts in the same exact form as Sotopia's, and the three datasets mentioned above already provide enough inspirational prompts.
## used_prompt.csv
This csv stores every used inspirational prompt, its source dataset, and the id of the environment object the prompt generated. To look up the detailed content of an environment by prompt, use this csv and find the "pk" of the inspirational prompt.
## experiment_episodes.json
This json file stores the detailed information for all Sotopia-pi conversations. Each conversation is a dictionary with:
1. epsiode_id: the unique id of the conversation
2. scenario: the social environment under which the conversation happens
3. codename: type of the scenario
4. agents_background: the two agents' social profiles, including age, secrets, personality, etc.
5. social_goals: the two agents' social goals, each of which the agent aims to achieve in the conversation
6. social_interactions: a list of turn-based conversations between two agents
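For illustration only (the field names follow the list above; all values here are made up, not taken from the dataset), a single episode entry can be built and navigated like this:

```python
import json

# A made-up episode matching the structure described above.
# Field names follow the card; the "epsiode_id" spelling is as documented.
episode = {
    "epsiode_id": "ep-0001",
    "scenario": "Two friends negotiating weekend plans",
    "codename": "negotiation",
    "agents_background": [{"name": "A", "age": 30}, {"name": "B", "age": 28}],
    "social_goals": ["convince B to go hiking", "stay home and rest"],
    "social_interactions": [
        {"turn": 0, "speaker": "A", "utterance": "Let's go hiking!"},
    ],
}

# experiment_episodes.json would then hold a collection of such entries;
# round-trip through JSON to mimic loading the file from disk.
payload = json.dumps([episode])
episodes = json.loads(payload)
first = episodes[0]
```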
|
enryu43/twitter100m_tweets | ---
dataset_info:
features:
- name: user
dtype: string
- name: id
dtype: int64
- name: tweet
dtype: string
- name: replies
dtype: int64
- name: retweets
dtype: int64
- name: likes
dtype: int64
- name: quotes
dtype: int64
- name: date
dtype: string
splits:
- name: train
num_bytes: 20356236942
num_examples: 88084332
download_size: 9614694227
dataset_size: 20356236942
---
# Dataset Card for "twitter100m_tweets"
Dataset with tweets for [this post](https://medium.com/@enryu9000/fun-with-large-scale-tweet-analysis-783c96b45df4). |
CyberHarem/a2_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of a2/A2/A2/A2 (Nikke: Goddess of Victory)
This is the dataset of a2/A2/A2/A2 (Nikke: Goddess of Victory), containing 476 images and their tags.
The core tags of this character are `long_hair, breasts, blue_eyes, mole, mole_under_mouth, white_hair, hair_between_eyes, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 476 | 679.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a2_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 476 | 387.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a2_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1038 | 742.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a2_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 476 | 604.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a2_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1038 | 1.03 GiB | [Download](https://huggingface.co/datasets/CyberHarem/a2_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/a2_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, android, armlet, bare_shoulders, black_gloves, elbow_gloves, robot_joints, solo, tank_top, black_shorts, short_shorts, looking_at_viewer, black_thighhighs |
| 1 | 35 |  |  |  |  |  | 1girl, android, black_gloves, elbow_gloves, robot_joints, solo, holding_sword, bare_shoulders, black_shorts, short_shorts, tank_top, looking_at_viewer, armlet, black_thighhighs |
| 2 | 7 |  |  |  |  |  | 1girl, android, bare_shoulders, black_gloves, black_shorts, elbow_gloves, holding_sword, looking_at_viewer, robot_joints, short_shorts, solo, tank_top, armlet, closed_mouth, collarbone, pink_lips, standing, very_long_hair, black_thighhighs, cowboy_shot, bangs, katana, grey_eyes |
| 3 | 7 |  |  |  |  |  | 1girl, android, bare_shoulders, black_gloves, elbow_gloves, looking_at_viewer, robot_joints, simple_background, solo, tank_top, upper_body, white_background, armlet, parted_lips |
| 4 | 6 |  |  |  |  |  | 1girl, android, bare_shoulders, black_gloves, collarbone, elbow_gloves, looking_at_viewer, robot_joints, solo, upper_body, closed_mouth, armlet, pink_lips, black_tank_top |
| 5 | 5 |  |  |  |  |  | 1girl, android, ass, bare_shoulders, black_gloves, black_shorts, black_thighhighs, elbow_gloves, from_behind, robot_joints, short_shorts, solo, high_heels, holding_sword, looking_back, standing, full_body, looking_at_viewer, thighs |
| 6 | 6 |  |  |  |  |  | 1girl, black_dress, black_hairband, black_thighhighs, cleavage_cutout, katana, black_gloves, feather-trimmed_sleeves, holding_sword, juliet_sleeves, looking_at_viewer, short_hair, solo, thigh_boots |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | android | armlet | bare_shoulders | black_gloves | elbow_gloves | robot_joints | solo | tank_top | black_shorts | short_shorts | looking_at_viewer | black_thighhighs | holding_sword | closed_mouth | collarbone | pink_lips | standing | very_long_hair | cowboy_shot | bangs | katana | grey_eyes | simple_background | upper_body | white_background | parted_lips | black_tank_top | ass | from_behind | high_heels | looking_back | full_body | thighs | black_dress | black_hairband | cleavage_cutout | feather-trimmed_sleeves | juliet_sleeves | short_hair | thigh_boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:---------|:-----------------|:---------------|:---------------|:---------------|:-------|:-----------|:---------------|:---------------|:--------------------|:-------------------|:----------------|:---------------|:-------------|:------------|:-----------|:-----------------|:--------------|:--------|:---------|:------------|:--------------------|:-------------|:-------------------|:--------------|:-----------------|:------|:--------------|:-------------|:---------------|:------------|:---------|:--------------|:-----------------|:------------------|:--------------------------|:-----------------|:-------------|:--------------|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 35 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | X | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | X | | | X | X | X | | | | | | | | X | | | X | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | | X | X | X | X | X | | X | X | X | X | X | | | | X | | | | | | | | | | | X | X | X | X | X | X | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | | X | | | X | | | | X | X | X | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
stoddur/med_chat_7 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 174024240.0
num_examples: 112710
download_size: 4105947
dataset_size: 174024240.0
---
# Dataset Card for "med_chat_7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_macadeliccc__Laser-WestLake-2x7b | ---
pretty_name: Evaluation run of macadeliccc/Laser-WestLake-2x7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [macadeliccc/Laser-WestLake-2x7b](https://huggingface.co/macadeliccc/Laser-WestLake-2x7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__Laser-WestLake-2x7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-27T21:37:13.080453](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__Laser-WestLake-2x7b/blob/main/results_2024-01-27T21-37-13.080453.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6524530392737511,\n\
\ \"acc_stderr\": 0.032031171056502564,\n \"acc_norm\": 0.6524349886456735,\n\
\ \"acc_norm_stderr\": 0.03269889548892183,\n \"mc1\": 0.5471236230110159,\n\
\ \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6924615568479368,\n\
\ \"mc2_stderr\": 0.015144126921968178\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.01338502163731357,\n\
\ \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7192790280820553,\n\
\ \"acc_stderr\": 0.004484330827465553,\n \"acc_norm\": 0.8843855805616411,\n\
\ \"acc_norm_stderr\": 0.0031910847927931548\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337135,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337135\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n\
\ \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n\
\ \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.03202563076101737,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.03202563076101737\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n\
\ \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"\
acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"\
acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948475,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948475\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n\
\ \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.02370309952525818,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.02370309952525818\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4435754189944134,\n\
\ \"acc_stderr\": 0.01661568040100372,\n \"acc_norm\": 0.4435754189944134,\n\
\ \"acc_norm_stderr\": 0.01661568040100372\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596728,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596728\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5471236230110159,\n\
\ \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6924615568479368,\n\
\ \"mc2_stderr\": 0.015144126921968178\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8579321231254933,\n \"acc_stderr\": 0.009812000391679364\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6353297952994693,\n \
\ \"acc_stderr\": 0.013258428375662247\n }\n}\n```"
repo_url: https://huggingface.co/macadeliccc/Laser-WestLake-2x7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|arc:challenge|25_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|gsm8k|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hellaswag|10_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T21-37-13.080453.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T21-37-13.080453.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- '**/details_harness|winogrande|5_2024-01-27T21-37-13.080453.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-27T21-37-13.080453.parquet'
- config_name: results
data_files:
- split: 2024_01_27T21_37_13.080453
path:
- results_2024-01-27T21-37-13.080453.parquet
- split: latest
path:
- results_2024-01-27T21-37-13.080453.parquet
---
# Dataset Card for Evaluation run of macadeliccc/Laser-WestLake-2x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/Laser-WestLake-2x7b](https://huggingface.co/macadeliccc/Laser-WestLake-2x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__Laser-WestLake-2x7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-27T21:37:13.080453](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__Laser-WestLake-2x7b/blob/main/results_2024-01-27T21-37-13.080453.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6524530392737511,
"acc_stderr": 0.032031171056502564,
"acc_norm": 0.6524349886456735,
"acc_norm_stderr": 0.03269889548892183,
"mc1": 0.5471236230110159,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.6924615568479368,
"mc2_stderr": 0.015144126921968178
},
"harness|arc:challenge|25": {
"acc": 0.7005119453924915,
"acc_stderr": 0.01338502163731357,
"acc_norm": 0.7226962457337884,
"acc_norm_stderr": 0.013082095839059374
},
"harness|hellaswag|10": {
"acc": 0.7192790280820553,
"acc_stderr": 0.004484330827465553,
"acc_norm": 0.8843855805616411,
"acc_norm_stderr": 0.0031910847927931548
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337135,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337135
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101737,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101737
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948475,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948475
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.02370309952525818,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.02370309952525818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4435754189944134,
"acc_stderr": 0.01661568040100372,
"acc_norm": 0.4435754189944134,
"acc_norm_stderr": 0.01661568040100372
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869649,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869649
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5471236230110159,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.6924615568479368,
"mc2_stderr": 0.015144126921968178
},
"harness|winogrande|5": {
"acc": 0.8579321231254933,
"acc_stderr": 0.009812000391679364
},
"harness|gsm8k|5": {
"acc": 0.6353297952994693,
"acc_stderr": 0.013258428375662247
}
}
```
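Once loaded, the per-task payload above can be filtered offline without re-querying the Hub. A minimal sketch (the dict literal is a hand-copied subset of the JSON results above, not a live download, and the 0.8 threshold is an arbitrary choice for illustration):

```python
# Hand-copied subset of the results JSON above (not fetched from the Hub).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.32},
    "harness|hendrycksTest-marketing|5": {"acc_norm": 0.8803418803418803},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.8304093567251462},
}

# Keep only the MMLU subtasks whose normalized accuracy clears the threshold.
strong = {
    task: scores["acc_norm"]
    for task, scores in results.items()
    if scores["acc_norm"] >= 0.8
}
print(sorted(strong))
```

The same pattern applies to the full `"all"` payload loaded via the `results` configuration.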
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
abideen/lex-0.1 | ---
dataset_info:
features:
- name: questions
dtype: string
- name: answers
dtype: string
splits:
- name: train
num_bytes: 824214090
num_examples: 378246
download_size: 435250193
dataset_size: 824214090
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_linlinlin__zephy_SFT_Hermes | ---
pretty_name: Evaluation run of linlinlin/zephy_SFT_Hermes
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [linlinlin/zephy_SFT_Hermes](https://huggingface.co/linlinlin/zephy_SFT_Hermes)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_linlinlin__zephy_SFT_Hermes\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T19:07:56.840567](https://huggingface.co/datasets/open-llm-leaderboard/details_linlinlin__zephy_SFT_Hermes/blob/main/results_2024-03-11T19-07-56.840567.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6347543492968393,\n\
\ \"acc_stderr\": 0.03219682236802622,\n \"acc_norm\": 0.6408300570930421,\n\
\ \"acc_norm_stderr\": 0.032843811369361584,\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.42171719347199227,\n\
\ \"mc2_stderr\": 0.014135019592668293\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5674061433447098,\n \"acc_stderr\": 0.014478005694182528,\n\
\ \"acc_norm\": 0.6032423208191127,\n \"acc_norm_stderr\": 0.014296513020180644\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6292571200955985,\n\
\ \"acc_stderr\": 0.004820166002253078,\n \"acc_norm\": 0.833698466440948,\n\
\ \"acc_norm_stderr\": 0.0037159010850549893\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642514,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\"\
: 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612924,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612924\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077823,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077823\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464076,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31620111731843575,\n\
\ \"acc_stderr\": 0.015551673652172552,\n \"acc_norm\": 0.31620111731843575,\n\
\ \"acc_norm_stderr\": 0.015551673652172552\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n\
\ \"acc_stderr\": 0.01272570165695364,\n \"acc_norm\": 0.45827900912646674,\n\
\ \"acc_norm_stderr\": 0.01272570165695364\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.0286619962023353,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.0286619962023353\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160896,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160896\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.42171719347199227,\n\
\ \"mc2_stderr\": 0.014135019592668293\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3707354056103108,\n \
\ \"acc_stderr\": 0.013304267705458431\n }\n}\n```"
repo_url: https://huggingface.co/linlinlin/zephy_SFT_Hermes
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|arc:challenge|25_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|gsm8k|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hellaswag|10_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-07-56.840567.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T19-07-56.840567.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- '**/details_harness|winogrande|5_2024-03-11T19-07-56.840567.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T19-07-56.840567.parquet'
- config_name: results
data_files:
- split: 2024_03_11T19_07_56.840567
path:
- results_2024-03-11T19-07-56.840567.parquet
- split: latest
path:
- results_2024-03-11T19-07-56.840567.parquet
---
# Dataset Card for Evaluation run of linlinlin/zephy_SFT_Hermes
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [linlinlin/zephy_SFT_Hermes](https://huggingface.co/linlinlin/zephy_SFT_Hermes) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_linlinlin__zephy_SFT_Hermes",
"harness_winogrande_5",
split="train")
```
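The timestamped split names used above are derived from the run timestamp by replacing the characters that are not allowed in split identifiers. A minimal sketch of this mapping (the helper name `split_name` is hypothetical, not part of the `datasets` API):

```python
# Hypothetical helper: derive the split name used in this card from a run timestamp.
# For example, run "2024-03-11T19:07:56.840567" is stored under the split
# "2024_03_11T19_07_56.840567".
def split_name(timestamp: str) -> str:
    # Replace "-" and ":" with "_" so the name is a valid split identifier,
    # keeping the fractional seconds intact.
    return timestamp.replace("-", "_").replace(":", "_")

print(split_name("2024-03-11T19:07:56.840567"))  # → 2024_03_11T19_07_56.840567
```

Passing the resulting name as `split=` to `load_dataset` selects that specific run instead of the `latest` alias.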
## Latest results
These are the [latest results from run 2024-03-11T19:07:56.840567](https://huggingface.co/datasets/open-llm-leaderboard/details_linlinlin__zephy_SFT_Hermes/blob/main/results_2024-03-11T19-07-56.840567.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find them in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6347543492968393,
"acc_stderr": 0.03219682236802622,
"acc_norm": 0.6408300570930421,
"acc_norm_stderr": 0.032843811369361584,
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777308,
"mc2": 0.42171719347199227,
"mc2_stderr": 0.014135019592668293
},
"harness|arc:challenge|25": {
"acc": 0.5674061433447098,
"acc_stderr": 0.014478005694182528,
"acc_norm": 0.6032423208191127,
"acc_norm_stderr": 0.014296513020180644
},
"harness|hellaswag|10": {
"acc": 0.6292571200955985,
"acc_stderr": 0.004820166002253078,
"acc_norm": 0.833698466440948,
"acc_norm_stderr": 0.0037159010850549893
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642514,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612924,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612924
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077823,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077823
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464076,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464076
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31620111731843575,
"acc_stderr": 0.015551673652172552,
"acc_norm": 0.31620111731843575,
"acc_norm_stderr": 0.015551673652172552
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45827900912646674,
"acc_stderr": 0.01272570165695364,
"acc_norm": 0.45827900912646674,
"acc_norm_stderr": 0.01272570165695364
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.0286619962023353,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.0286619962023353
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160896,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160896
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777308,
"mc2": 0.42171719347199227,
"mc2_stderr": 0.014135019592668293
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.3707354056103108,
"acc_stderr": 0.013304267705458431
}
}
```
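The per-task `acc` values in the `hendrycksTest` entries above are typically aggregated into a single MMLU-style average. A minimal sketch, using a handful of values copied from the results above (the full benchmark averages all 57 subjects the same way):

```python
# Average the per-task accuracies of the hendrycksTest (MMLU) subjects.
# Values below are copied from a few of the task entries above; the
# leaderboard's reported MMLU score averages all 57 subjects like this.
task_acc = {
    "abstract_algebra": 0.27,
    "anatomy": 0.6222222222222222,
    "astronomy": 0.6578947368421053,
    "world_religions": 0.8421052631578947,
}

mmlu_avg = sum(task_acc.values()) / len(task_acc)
print(f"MMLU average over {len(task_acc)} subjects: {mmlu_avg:.4f}")
```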
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
justinsiow/fathomnet-test | ---
dataset_info:
features:
- name: image
dtype: image
- name: image_id
dtype: int64
- name: width
dtype: int64
- name: height
dtype: int64
splits:
- name: test
num_bytes: 22257603152.424
num_examples: 10744
download_size: 22919610639
dataset_size: 22257603152.424
---
# Dataset Card for "fathomnet-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/kamado_nezuko_demonslayer | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kamado_nezuko (Kimetsu no Yaiba)
This is the dataset of kamado_nezuko (Kimetsu no Yaiba), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
|
KentoTsu/EliseRato | ---
license: openrail
---
|
tarudesu/ViHealthQA | ---
task_categories:
- question-answering
language:
- vi
tags:
- medical
pretty_name: Vietnamese Healthcare Question Answering Dataset
size_categories:
- 10K<n<100K
---
## Disclaimer:
The dataset may contain personal information crawled along with the contents of various sources. Please filter the data during pre-processing before starting your research or training.
# SPBERTQA: A Two-Stage Question Answering System Based on Sentence Transformers for Medical Texts
This is the official repository for the ViHealthQA dataset from the paper [SPBERTQA: A Two-Stage Question Answering System Based on Sentence Transformers for Medical Texts](https://arxiv.org/pdf/2206.09600.pdf), which was accepted at the [KSEM-2022](https://ksem22.smart-conf.net/index.html).
# Citation Information
The provided dataset is only used for research purposes!
```
@InProceedings{nguyen2022viheathqa,
author="Nguyen, Nhung Thi-Hong
and Ha, Phuong Phan-Dieu
and Nguyen, Luan Thanh
and Van Nguyen, Kiet
and Nguyen, Ngan Luu-Thuy",
title="SPBERTQA: A Two-Stage Question Answering System Based on Sentence Transformers for Medical Texts",
booktitle="Knowledge Science, Engineering and Management",
year="2022",
publisher="Springer International Publishing",
address="Cham",
pages="371--382",
isbn="978-3-031-10986-7"
}
```
# Abstract
Question answering (QA) systems have gained explosive attention in recent years. However, QA tasks in Vietnamese do not have many datasets. Significantly, there is mostly no dataset in the medical domain. Therefore, we built a Vietnamese Healthcare Question Answering dataset (ViHealthQA), including 10,015 question-answer passage pairs for this task, in which questions from health-interested users were asked on prestigious health websites and answers from highly qualified experts. This paper proposes a two-stage QA system based on Sentence-BERT (SBERT) using multiple negatives ranking (MNR) loss combined with BM25. Then, we conduct diverse experiments with many bag-of-words models to assess our system’s performance. With the obtained results, this system achieves better performance than traditional methods.
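The first retrieval stage described above is BM25; as an illustration, here is a minimal Okapi BM25 scoring sketch over a toy English corpus (simplified whitespace tokenization, not the paper's actual implementation):

```python
import math
from collections import Counter

def bm25_scores(query, corpus, k1=1.5, b=0.75):
    """Score each document in `corpus` against `query` with Okapi BM25."""
    docs = [doc.lower().split() for doc in corpus]
    avg_len = sum(len(d) for d in docs) / len(docs)
    n_docs = len(docs)
    # document frequency of each term across the corpus
    df = Counter(t for d in docs for t in set(d))
    scores = []
    for d in docs:
        tf = Counter(d)
        score = 0.0
        for term in query.lower().split():
            if term not in tf:
                continue
            idf = math.log(1 + (n_docs - df[term] + 0.5) / (df[term] + 0.5))
            denom = tf[term] + k1 * (1 - b + b * len(d) / avg_len)
            score += idf * tf[term] * (k1 + 1) / denom
        scores.append(score)
    return scores

corpus = [
    "paracetamol relieves fever and mild pain",
    "regular exercise improves heart health",
    "fever in children should be monitored closely",
]
print(bm25_scores("fever treatment", corpus))
```

In the paper's two-stage system, the top BM25 candidates would then be re-ranked by the SBERT model trained with MNR loss.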
# Dataset
The ViHealthQA dataset consists of 10,015 question-answer passage pairs. Note that the questions were asked by health-interested users on prestigious health websites, and the answers come from highly qualified experts.
The dataset is divided into three parts as follows:
1. Train set: 7.01K question-answer pairs
2. Valid set: 2.01K question-answer pairs
3. Test set: 993 question-answer pairs
# Contact
Please feel free to contact us by email luannt@uit.edu.vn if you have any further information! |
dim/ru_turbo_alpaca_evol_instruct_3k | ---
license: mit
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: iteration
dtype: int64
splits:
- name: train
num_bytes: 6677510
num_examples: 3000
download_size: 3214805
dataset_size: 6677510
---
|
argilla/databricks-dolly-15k-curated-multilingual | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: instruction_original_en
dtype: string
- name: context_original_en
dtype: string
- name: response_original_en
dtype: string
- name: id
dtype: int64
splits:
- name: de
num_bytes: 25985140
num_examples: 15015
- name: en
num_bytes: 24125109
num_examples: 15015
- name: es
num_bytes: 25902709
num_examples: 15015
- name: fr
num_bytes: 26704314
num_examples: 15015
download_size: 65586669
dataset_size: 102717272
license: cc-by-sa-3.0
task_categories:
- text-generation
- text2text-generation
language:
- es
- de
- fr
tags:
- machine-translated
- instruction-following
pretty_name: Databrick Dolly Instructions Multilingual
size_categories:
- 10K<n<100K
---
# Dataset Card for "databricks-dolly-15k-curated-multilingual"
A curated and multilingual version of the Databricks Dolly instructions dataset. It includes a programmatically and manually corrected version of the original `en` dataset. See below.
**STATUS**:
Currently, the original Dolly v2 English version has been curated by combining automatic processing and collaborative human curation using Argilla (~400 records have been manually edited and fixed). The following graph shows a summary of the number of edited fields.

## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage: https://huggingface.co/datasets/argilla/databricks-dolly-15k-multilingual/**
- **Repository: https://huggingface.co/datasets/argilla/databricks-dolly-15k-multilingual/**
- **Paper:**
- **Leaderboard:**
- **Point of Contact: contact@argilla.io, https://github.com/argilla-io/argilla**
### Dataset Summary
This dataset collection is a curated and machine-translated version of the `databricks-dolly-15k` [dataset](https://github.com/databrickslabs/dolly/tree/master/data) originally created by Databricks, Inc. in 2023.
The goal is to give practitioners a starting point for training open-source instruction-following models with better-quality English data and translated data beyond English. However, as the translation quality will not be perfect, we highly recommend dedicating time to curating and fixing translation issues. Below we explain how to load the datasets into [Argilla for data curation and fixing](https://github.com/argilla-io/argilla). Additionally, we'll be improving the datasets made available here with the help of different communities.
Currently, the original English version has been curated by combining automatic processing and collaborative human curation using Argilla (~400 records have been manually edited and fixed). The following graph shows a summary of the number of edited fields.
The main issues (likely many issues still remaining) are the following:
1. Some labelers misunderstood the usage of the `context` field. This `context` field is used as part of the prompt for instruction-tuning and in other works it's called `input` (e.g., Alpaca). The name `context` has likely led some labelers to use it to provide the full context from which they extracted the response. This is problematic for some types of tasks (summarization, closed-qa, or information-extraction) because sometimes the context is shorter than or unrelated to the summaries, or the information cannot be extracted from the context (closed-qa, information-extraction).
2. Some labelers misunderstood the way to give instructions for summarization or closed-qa; for example, they ask "Who is Thomas Jefferson?" and then provide a very long context and an equally long response.
We programmatically identified records with these potential issues and ran a campaign to fix them; as a result, more than 400 records have been adapted. See below for statistics:

As a result of this curation process, the length of the fields, counted in number of tokens, has been reduced, especially for the responses:

If you want to browse and curate your dataset with Argilla, you can:
1. [Duplicate this Space](https://huggingface.co/spaces/argilla/dolly-multilingual-curation/settings?duplicate=true). IMPORTANT: The Space's visibility needs to be Public, but you can set up your own password and API keys [following this guide](https://docs.argilla.io/en/latest/getting_started/installation/deployments/huggingface-spaces.html#setting-up-secret-environment-variables).
2. Set up two secrets: `HF_TOKEN` and `LANG` to indicate the language split.
3. Log in with `admin`/`12345678` and start browsing and labeling.
4. Start labeling. Every 5 minutes, the validations will be stored in a Hub dataset in your personal HF space.
5. Please get in touch to contribute fixes and improvements to the source datasets.
There's one split per language:
```python
from datasets import load_dataset
# loads all splits
load_dataset("argilla/databricks-dolly-15k-curated-multilingual")
# loads Spanish splits
load_dataset("argilla/databricks-dolly-15k-curated-multilingual", split="es")
```
### Supported Tasks and Leaderboards
As described in the README of the original dataset, this dataset can be used for:
* Training LLMs
* Synthetic Data Generation
* Data Augmentation
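For the LLM-training use case, the `instruction`/`context`/`response` fields are typically assembled into a single training prompt. A minimal sketch of one common instruction-tuning template (an assumption for illustration, not the template any particular model was trained with):

```python
def build_prompt(instruction: str, context: str, response: str) -> str:
    """Assemble one training example from the dataset's three text fields.

    The "### Instruction / Context / Response" layout below is a common
    instruction-tuning convention, used here purely as an example.
    """
    if context.strip():
        return (f"### Instruction:\n{instruction}\n\n"
                f"### Context:\n{context}\n\n"
                f"### Response:\n{response}")
    return (f"### Instruction:\n{instruction}\n\n"
            f"### Response:\n{response}")

# Example with an empty context, as in many open-qa records:
print(build_prompt("¿Quién escribió el Quijote?", "", "Miguel de Cervantes."))
```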
### Languages
Currently: `es`, `fr`, `de`, `en`
Join Argilla [Slack community](https://join.slack.com/t/rubrixworkspace/shared_invite/zt-whigkyjn-a3IUJLD7gDbTZ0rKlvcJ5g) if you want to help us include other languages.
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
There's one split per language:
```python
from datasets import load_dataset
# loads all splits
load_dataset("argilla/databricks-dolly-15k-multilingual")
# loads Spanish splits
load_dataset("argilla/databricks-dolly-15k-multilingual", split="es")
```
## Dataset Creation
These datasets were translated from the original English dataset using the DeepL API between the 13th and 14th of April 2023.
### Curation Logbook
* 28/04/23: Removed references from Wikipedia copy pastes for 8113 rows. Applied to context and response fields with the following regex: `r'\[[\w]+\]'`
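The reference-stripping step recorded in the logbook above can be reproduced with a few lines of Python, using the same regex:

```python
import re

# Regex recorded in the curation logbook: removes Wikipedia-style
# reference markers such as [1] or [12] left over from copy-pastes.
REF_PATTERN = re.compile(r'\[[\w]+\]')

def strip_refs(text: str) -> str:
    """Remove bracketed reference markers from a context or response field."""
    return REF_PATTERN.sub('', text)

example = "Thomas Jefferson[1] was an American statesman.[2]"
print(strip_refs(example))  # -> "Thomas Jefferson was an American statesman."
```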
### Source Data
#### Initial Data Collection and Normalization
Refer to the [original dataset](https://github.com/databrickslabs/dolly/tree/master/data) for more information.
#### Who are the source language producers?
[More Information Needed]
### Annotations
Annotations are planned but not performed yet.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
This dataset can be used for any purpose, whether academic or commercial, under the terms of the [Creative Commons Attribution-ShareAlike 3.0 Unported License](https://creativecommons.org/licenses/by-sa/3.0/legalcode).
**Original dataset Owner: Databricks, Inc.**
### Citation Information
[More Information Needed] |
NathanRoll/TalkBank_CA_CABNC_5 | ---
dataset_info:
features:
- name: audio
sequence: float32
- name: __index_level_0__
dtype: string
splits:
- name: train
num_bytes: 17917023985
num_examples: 99
download_size: 17934314304
dataset_size: 17917023985
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "TalkBank_CA_CABNC_5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ongknsro/nekos-nonnekos | ---
task_categories:
- image-classification
pretty_name: neko or non_neko
size_categories:
- 1K<n<10K
---
A dataset of SD-generated images of anime neko characters vs. non_neko characters. |
open-llm-leaderboard/details_LLMs__WizardLM-30B-V1.0 | ---
pretty_name: Evaluation run of LLMs/WizardLM-30B-V1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LLMs/WizardLM-30B-V1.0](https://huggingface.co/LLMs/WizardLM-30B-V1.0) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LLMs__WizardLM-30B-V1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-30T02:37:12.561310](https://huggingface.co/datasets/open-llm-leaderboard/details_LLMs__WizardLM-30B-V1.0/blob/main/results_2023-10-30T02-37-12.561310.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.286493288590604,\n\
\ \"em_stderr\": 0.0046301590416793666,\n \"f1\": 0.3621350671140946,\n\
\ \"f1_stderr\": 0.00452241220066869,\n \"acc\": 0.4967032138503913,\n\
\ \"acc_stderr\": 0.011557270415432395\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.286493288590604,\n \"em_stderr\": 0.0046301590416793666,\n\
\ \"f1\": 0.3621350671140946,\n \"f1_stderr\": 0.00452241220066869\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.21834723275208492,\n \
\ \"acc_stderr\": 0.011379497266738049\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126742\n\
\ }\n}\n```"
repo_url: https://huggingface.co/LLMs/WizardLM-30B-V1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T07_27_45.879930
path:
- '**/details_harness|drop|3_2023-10-17T07-27-45.879930.parquet'
- split: 2023_10_30T02_37_12.561310
path:
- '**/details_harness|drop|3_2023-10-30T02-37-12.561310.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-30T02-37-12.561310.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T07_27_45.879930
path:
- '**/details_harness|gsm8k|5_2023-10-17T07-27-45.879930.parquet'
- split: 2023_10_30T02_37_12.561310
path:
- '**/details_harness|gsm8k|5_2023-10-30T02-37-12.561310.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-30T02-37-12.561310.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:36:33.189763.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T13_36_33.189763
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:36:33.189763.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:36:33.189763.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T07_27_45.879930
path:
- '**/details_harness|winogrande|5_2023-10-17T07-27-45.879930.parquet'
- split: 2023_10_30T02_37_12.561310
path:
- '**/details_harness|winogrande|5_2023-10-30T02-37-12.561310.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-30T02-37-12.561310.parquet'
- config_name: results
data_files:
- split: 2023_10_17T07_27_45.879930
path:
- results_2023-10-17T07-27-45.879930.parquet
- split: 2023_10_30T02_37_12.561310
path:
- results_2023-10-30T02-37-12.561310.parquet
- split: latest
path:
- results_2023-10-30T02-37-12.561310.parquet
---
# Dataset Card for Evaluation run of LLMs/WizardLM-30B-V1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/LLMs/WizardLM-30B-V1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [LLMs/WizardLM-30B-V1.0](https://huggingface.co/LLMs/WizardLM-30B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LLMs__WizardLM-30B-V1.0",
"harness_winogrande_5",
split="train")
```
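As an aside, the timestamped split names used throughout the configs in this card appear to be the run timestamps with `-` and `:` replaced by `_` (an assumption inferred from the file names listed above, not an official API); a minimal sketch:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name.

    Assumption based on this card's config list: the split name is the
    timestamp with '-' and ':' replaced by '_' (the 'T' and '.' are kept).
    """
    return ts.replace("-", "_").replace(":", "_")

# e.g. the latest winogrande run listed in this card:
print(timestamp_to_split("2023-10-30T02:37:12.561310"))
# → 2023_10_30T02_37_12.561310
```

Passing the resulting string as `split=` to `load_dataset` selects that specific run instead of `"latest"`.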
## Latest results
These are the [latest results from run 2023-10-30T02:37:12.561310](https://huggingface.co/datasets/open-llm-leaderboard/details_LLMs__WizardLM-30B-V1.0/blob/main/results_2023-10-30T02-37-12.561310.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.286493288590604,
"em_stderr": 0.0046301590416793666,
"f1": 0.3621350671140946,
"f1_stderr": 0.00452241220066869,
"acc": 0.4967032138503913,
"acc_stderr": 0.011557270415432395
},
"harness|drop|3": {
"em": 0.286493288590604,
"em_stderr": 0.0046301590416793666,
"f1": 0.3621350671140946,
"f1_stderr": 0.00452241220066869
},
"harness|gsm8k|5": {
"acc": 0.21834723275208492,
"acc_stderr": 0.011379497266738049
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126742
}
}
```
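The nested results dict above can be flattened into per-task rows for quick comparison; a hedged sketch (the dict literal is copied from the latest results shown above, with the stderr fields omitted for brevity, and the `harness|<task>|<n_shot>` key format is taken from the keys visible there):

```python
# Flatten the "latest results" dict shown above into per-task rows.
latest = {
    "all": {"em": 0.286493288590604, "f1": 0.3621350671140946,
            "acc": 0.4967032138503913},
    "harness|drop|3": {"em": 0.286493288590604, "f1": 0.3621350671140946},
    "harness|gsm8k|5": {"acc": 0.21834723275208492},
    "harness|winogrande|5": {"acc": 0.7750591949486977},
}

rows = []
for key, metrics in latest.items():
    if key == "all":
        continue  # "all" is the aggregate entry, not a task
    _, task, n_shot = key.split("|")
    for metric, value in metrics.items():
        rows.append((task, int(n_shot), metric, round(value, 4)))

for row in sorted(rows):
    print(row)
```

This yields one `(task, n_shot, metric, value)` tuple per reported metric, which is convenient for tabulating or diffing successive runs.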
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Zexuan/soc_data | ---
license: apache-2.0
task_categories:
- text-classification
tags:
- finance
pretty_name: job posting occupation classification
size_categories:
- 10M<n<100M
--- |
boborr/twil | ---
license: openrail
---
|
open-llm-leaderboard/details_hongzoh__Yi-6B_Open-Orca | ---
pretty_name: Evaluation run of hongzoh/Yi-6B_Open-Orca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hongzoh/Yi-6B_Open-Orca](https://huggingface.co/hongzoh/Yi-6B_Open-Orca) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hongzoh__Yi-6B_Open-Orca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-30T16:23:51.884120](https://huggingface.co/datasets/open-llm-leaderboard/details_hongzoh__Yi-6B_Open-Orca/blob/main/results_2024-03-30T16-23-51.884120.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5717858475028725,\n\
\ \"acc_stderr\": 0.033155143858099624,\n \"acc_norm\": 0.5813524540566153,\n\
\ \"acc_norm_stderr\": 0.033897650897487565,\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707687,\n \"mc2\": 0.38629872708898366,\n\
\ \"mc2_stderr\": 0.013894111895665723\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.45819112627986347,\n \"acc_stderr\": 0.014560220308714695,\n\
\ \"acc_norm\": 0.5119453924914675,\n \"acc_norm_stderr\": 0.014607220340597167\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4929296952798247,\n\
\ \"acc_stderr\": 0.004989282516055395,\n \"acc_norm\": 0.6959768970324637,\n\
\ \"acc_norm_stderr\": 0.0045905235720580155\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.037724468575180276,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.037724468575180276\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.02458002892148101,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.02458002892148101\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586804,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586804\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817227,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817227\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.02502861027671086,\n \
\ \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.02502861027671086\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371216,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371216\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513537,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513537\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.017324352325016022,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.017324352325016022\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475524,\n \"\
acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475524\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864907,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864907\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.044642857142857116,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.044642857142857116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531772,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531772\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7177522349936143,\n\
\ \"acc_stderr\": 0.01609530296987855,\n \"acc_norm\": 0.7177522349936143,\n\
\ \"acc_norm_stderr\": 0.01609530296987855\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.02536116874968823,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.02536116874968823\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3463687150837989,\n\
\ \"acc_stderr\": 0.015913546784020117,\n \"acc_norm\": 0.3463687150837989,\n\
\ \"acc_norm_stderr\": 0.015913546784020117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159617,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159617\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.026730620728004913,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.026730620728004913\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291488,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291488\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n\
\ \"acc_stderr\": 0.012656810383983969,\n \"acc_norm\": 0.4335071707953064,\n\
\ \"acc_norm_stderr\": 0.012656810383983969\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4963235294117647,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.4963235294117647,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6078431372549019,\n \"acc_stderr\": 0.019751726508762637,\n \
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.019751726508762637\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\
\ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.5727272727272728,\n\
\ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.029705284056772436,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.029705284056772436\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707687,\n \"mc2\": 0.38629872708898366,\n\
\ \"mc2_stderr\": 0.013894111895665723\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7040252565114443,\n \"acc_stderr\": 0.012829348226339006\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13191811978771797,\n \
\ \"acc_stderr\": 0.009321265253857517\n }\n}\n```"
repo_url: https://huggingface.co/hongzoh/Yi-6B_Open-Orca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|arc:challenge|25_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|arc:challenge|25_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|gsm8k|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|gsm8k|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hellaswag|10_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hellaswag|10_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-24-31.081696.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-23-51.884120.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T16-23-51.884120.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- '**/details_harness|winogrande|5_2024-03-30T15-24-31.081696.parquet'
- split: 2024_03_30T16_23_51.884120
path:
- '**/details_harness|winogrande|5_2024-03-30T16-23-51.884120.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-30T16-23-51.884120.parquet'
- config_name: results
data_files:
- split: 2024_03_30T15_24_31.081696
path:
- results_2024-03-30T15-24-31.081696.parquet
- split: 2024_03_30T16_23_51.884120
path:
- results_2024-03-30T16-23-51.884120.parquet
- split: latest
path:
- results_2024-03-30T16-23-51.884120.parquet
---
# Dataset Card for Evaluation run of hongzoh/Yi-6B_Open-Orca
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [hongzoh/Yi-6B_Open-Orca](https://huggingface.co/hongzoh/Yi-6B_Open-Orca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hongzoh__Yi-6B_Open-Orca",
"harness_winogrande_5",
	split="latest")
```
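Every per-task split shares the same schema, so the loaded details can be aggregated however you like. As a minimal sketch (the nested key names mirror the results JSON shown below in this card, and the sample values are copied from it rather than loaded over the network), here is how one might rank tasks by accuracy from such a results dictionary:

```python
# Sketch: rank benchmark tasks by accuracy from a dict shaped like the
# results JSON in this card. The sample values are copied from the
# "Latest results" section; a real run would instead load the full dict
# from the "results" config of this dataset.
results = {
    "harness|arc:challenge|25": {"acc": 0.45819112627986347},
    "harness|hellaswag|10": {"acc": 0.4929296952798247},
    "harness|hendrycksTest-computer_security|5": {"acc": 0.75},
}

# Sort task names from highest to lowest accuracy.
ranked = sorted(results, key=lambda task: results[task]["acc"], reverse=True)

for task in ranked:
    print(f"{task}: {results[task]['acc']:.3f}")
```

The same pattern applies to `acc_norm`, `mc1`, or `mc2` keys where a task reports them.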
## Latest results
These are the [latest results from run 2024-03-30T16:23:51.884120](https://huggingface.co/datasets/open-llm-leaderboard/details_hongzoh__Yi-6B_Open-Orca/blob/main/results_2024-03-30T16-23-51.884120.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5717858475028725,
"acc_stderr": 0.033155143858099624,
"acc_norm": 0.5813524540566153,
"acc_norm_stderr": 0.033897650897487565,
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707687,
"mc2": 0.38629872708898366,
"mc2_stderr": 0.013894111895665723
},
"harness|arc:challenge|25": {
"acc": 0.45819112627986347,
"acc_stderr": 0.014560220308714695,
"acc_norm": 0.5119453924914675,
"acc_norm_stderr": 0.014607220340597167
},
"harness|hellaswag|10": {
"acc": 0.4929296952798247,
"acc_stderr": 0.004989282516055395,
"acc_norm": 0.6959768970324637,
"acc_norm_stderr": 0.0045905235720580155
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.037724468575180276,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.037724468575180276
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332783,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.02458002892148101,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.02458002892148101
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586804,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586804
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817227,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817227
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.02502861027671086,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.02502861027671086
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02671924078371216,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02671924078371216
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513537,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513537
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.017324352325016022,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.017324352325016022
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.031822318676475524,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.031822318676475524
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864907,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864907
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.044642857142857116,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.044642857142857116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531772,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531772
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560396,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560396
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7177522349936143,
"acc_stderr": 0.01609530296987855,
"acc_norm": 0.7177522349936143,
"acc_norm_stderr": 0.01609530296987855
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.02536116874968823,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.02536116874968823
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3463687150837989,
"acc_stderr": 0.015913546784020117,
"acc_norm": 0.3463687150837989,
"acc_norm_stderr": 0.015913546784020117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159617,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159617
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.026730620728004913,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.026730620728004913
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291488,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291488
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983969,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983969
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4963235294117647,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.4963235294117647,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.019751726508762637,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.019751726508762637
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772436,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772436
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707687,
"mc2": 0.38629872708898366,
"mc2_stderr": 0.013894111895665723
},
"harness|winogrande|5": {
"acc": 0.7040252565114443,
"acc_stderr": 0.012829348226339006
},
"harness|gsm8k|5": {
"acc": 0.13191811978771797,
"acc_stderr": 0.009321265253857517
}
}
```
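The per-task `acc` values above can be rolled up into a single MMLU-style score. A minimal sketch, using only two of the entries above inlined as a literal dict (in practice, parse the full results file with `json.load()`):

```python
# Average the `acc` of every hendrycksTest (MMLU) task in a results dict
# shaped like the JSON above. Only a subset of entries is inlined here.
results = {
    "harness|hendrycksTest-formal_logic|5": {"acc": 0.38095238095238093},
    "harness|hendrycksTest-global_facts|5": {"acc": 0.3},
    "harness|gsm8k|5": {"acc": 0.13191811978771797},  # not an MMLU task
}

mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu) / len(mmlu)
print(round(mmlu_avg, 4))
```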
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Ammar-Azman/shinjiru-blog | ---
license: mit
---
|
CyberHarem/anzu_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of anzu/アンズ (Pokémon)
This is the dataset of anzu/アンズ (Pokémon), containing 82 images and their tags.
The core tags of this character are `purple_hair, purple_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 82 | 52.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anzu_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 82 | 37.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anzu_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 138 | 62.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anzu_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 82 | 48.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anzu_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 138 | 77.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anzu_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/anzu_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, blush, ninja, closed_mouth, looking_at_viewer, solo, purple_scarf, smile |
| 1 | 7 |  |  |  |  |  | 1girl, ninja, scarf, fishnets, ponytail, black_hair, japanese_clothes, pokemon_(creature) |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | ninja | closed_mouth | looking_at_viewer | solo | purple_scarf | smile | scarf | fishnets | ponytail | black_hair | japanese_clothes | pokemon_(creature) |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:---------------|:--------------------|:-------|:---------------|:--------|:--------|:-----------|:-----------|:-------------|:-------------------|:---------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 7 |  |  |  |  |  | X | | X | | | | | | X | X | X | X | X | X |
|
carnival13/massive_val_DA2_tokenized | ---
dataset_info:
features:
- name: pass_label
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 16518290
num_examples: 24160
download_size: 3770585
dataset_size: 16518290
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "massive_val_DA2_tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Fredithefish/GPTeacher-for-RedPajama-Chat | ---
license: apache-2.0
---
|
dvinagre/euskera-speaker-embeddings | ---
dataset_info:
features:
- name: speaker_embeddings
sequence: float64
splits:
- name: train
num_bytes: 40286600
num_examples: 9826
download_size: 33659727
dataset_size: 40286600
---
# Dataset Card for "euskera-speaker-embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pszemraj/fleece2instructions | ---
license: cc-by-4.0
source_datasets: tatsu-lab/alpaca
task_categories:
- text-generation
- text2text-generation
language:
- en
tags:
- alpaca
- instruction generation
size_categories:
- 10K<n<100K
---
# fleece2instructions
The [`tatsu-lab/alpaca` dataset](https://huggingface.co/datasets/tatsu-lab/alpaca) was split into train/test/val with the goal of training text-to-text generation models to generate instruction prompts corresponding to arbitrary text.
To do this, you would use:
- `output` as the text2text model **input** column
- `instruction` as the text2text model **target/output** column
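A minimal sketch of this column mapping, shown on a toy row (for real use, load the splits with `datasets.load_dataset("pszemraj/fleece2instructions")`; the `source_text`/`target_text` names here are just an illustrative convention):

```python
row = {
    "instruction": "Summarize the following passage in one sentence.",
    "output": "A long passage of arbitrary text to generate an instruction for ...",
}

def to_text2text(example):
    # `output` is what the model reads; `instruction` is what it learns to emit.
    return {"source_text": example["output"], "target_text": example["instruction"]}

pair = to_text2text(row)
print(pair["source_text"], "->", pair["target_text"])
```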
## modifications & filtering
Rows that used the `input` column in the original dataset, and rows where the `output` column contains fewer than 8 words, were dropped.
Link to the [function used to filter](https://gist.github.com/pszemraj/3633acb0cf3288d49b7bee550e756839) the original dataset after splitting:
- The `filter_dataset` function reads datasets, counts tokens in specified columns, filters rows based on a minimum number of tokens, drops specified columns and/or rows with non-NaN values, and saves the modified datasets to a new directory. It returns summary statistics of the modified records.
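The linked gist is the authoritative version; a rough, hypothetical sketch of the row-level criteria it applies (word counting here stands in for the gist's token counting):

```python
MIN_WORDS = 8  # rows with a shorter `output` are dropped

def keep_row(row):
    # Drop rows that used the `input` column, and rows whose `output`
    # falls below the minimum length.
    used_input = bool(str(row.get("input") or "").strip())
    long_enough = len(str(row["output"]).split()) >= MIN_WORDS
    return (not used_input) and long_enough

rows = [
    {"input": "", "output": "one two three four five six seven eight nine"},
    {"input": "some context", "output": "one two three four five six seven eight nine"},
    {"input": "", "output": "too short"},
]
filtered = [r for r in rows if keep_row(r)]
print(len(filtered))
```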
## dataset info
Output of loading the dataset:
```python
DatasetDict({
train: Dataset({
features: ['instruction', 'output'],
num_rows: 23167
})
test: Dataset({
features: ['instruction', 'output'],
num_rows: 2822
})
validation: Dataset({
features: ['instruction', 'output'],
num_rows: 2866
})
})
```
## token counts in the `output` column
t5

bart-base

--- |
kayteekay/jordan-generator-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 605426023.284
num_examples: 2317
download_size: 509156922
dataset_size: 605426023.284
---
# Dataset Card for "jordan-generator-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AppleHarem/vanilla_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of vanilla (Arknights)
This is the dataset of vanilla (Arknights), containing 30 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
This is a WebUI that contains crawlers and other tools: [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 30 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 65 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 68 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 30 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 30 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 30 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 65 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 65 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 31 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 68 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 68 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
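The download links above are relative to this repository. A minimal sketch of turning a package name into a direct download URL, assuming the standard Hugging Face Hub `resolve` URL layout (files sit at the root of the repo's `main` branch):

```python
REPO_ID = "AppleHarem/vanilla_arknights"

def package_url(filename: str) -> str:
    # Assumed Hub layout: datasets/<repo_id>/resolve/<revision>/<path>
    return f"https://huggingface.co/datasets/{REPO_ID}/resolve/main/{filename}"

print(package_url("dataset-raw.zip"))
```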
|
ganader-ia-developers/cowdb | ---
dataset_info:
- config_name: default
features:
- name: image
dtype: image
- name: cow_id
dtype: int64
- name: weight
dtype: float64
- name: source
dtype: string
- name: breed
dtype: string
- name: sex
dtype: string
- name: orientation
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 2999177241.0
num_examples: 876
download_size: 2991378426
dataset_size: 2999177241.0
- config_name: resized
features:
- name: image
dtype: image
- name: cow_id
dtype: int64
- name: weight
dtype: float64
- name: source
dtype: string
- name: breed
dtype: string
- name: sex
dtype: string
- name: orientation
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 871009194.0
num_examples: 876
download_size: 871019789
dataset_size: 871009194.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: resized
data_files:
- split: train
path: resized/train-*
---
|
open-llm-leaderboard/details_ashercn97__giraffe-7b | ---
pretty_name: Evaluation run of ashercn97/giraffe-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ashercn97/giraffe-7b](https://huggingface.co/ashercn97/giraffe-7b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ashercn97__giraffe-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T20:53:47.065964](https://huggingface.co/datasets/open-llm-leaderboard/details_ashercn97__giraffe-7b/blob/main/results_2023-09-22T20-53-47.065964.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00388003355704698,\n\
\ \"em_stderr\": 0.0006366682825520032,\n \"f1\": 0.06388317953020159,\n\
\ \"f1_stderr\": 0.0014760537495948263,\n \"acc\": 0.3581768614021409,\n\
\ \"acc_stderr\": 0.008713750066062537\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.00388003355704698,\n \"em_stderr\": 0.0006366682825520032,\n\
\ \"f1\": 0.06388317953020159,\n \"f1_stderr\": 0.0014760537495948263\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.026535253980288095,\n \
\ \"acc_stderr\": 0.004427045987265172\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6898184688239937,\n \"acc_stderr\": 0.013000454144859902\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ashercn97/giraffe-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|arc:challenge|25_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T20_53_47.065964
path:
- '**/details_harness|drop|3_2023-09-22T20-53-47.065964.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T20-53-47.065964.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T20_53_47.065964
path:
- '**/details_harness|gsm8k|5_2023-09-22T20-53-47.065964.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T20-53-47.065964.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hellaswag|10_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T15:44:19.746565.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-02T15:44:19.746565.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-02T15:44:19.746565.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T20_53_47.065964
path:
- '**/details_harness|winogrande|5_2023-09-22T20-53-47.065964.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T20-53-47.065964.parquet'
- config_name: results
data_files:
- split: 2023_08_02T15_44_19.746565
path:
- results_2023-08-02T15:44:19.746565.parquet
- split: 2023_09_22T20_53_47.065964
path:
- results_2023-09-22T20-53-47.065964.parquet
- split: latest
path:
- results_2023-09-22T20-53-47.065964.parquet
---
# Dataset Card for Evaluation run of ashercn97/giraffe-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ashercn97/giraffe-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ashercn97/giraffe-7b](https://huggingface.co/ashercn97/giraffe-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ashercn97__giraffe-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T20:53:47.065964](https://huggingface.co/datasets/open-llm-leaderboard/details_ashercn97__giraffe-7b/blob/main/results_2023-09-22T20-53-47.065964.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.00388003355704698,
"em_stderr": 0.0006366682825520032,
"f1": 0.06388317953020159,
"f1_stderr": 0.0014760537495948263,
"acc": 0.3581768614021409,
"acc_stderr": 0.008713750066062537
},
"harness|drop|3": {
"em": 0.00388003355704698,
"em_stderr": 0.0006366682825520032,
"f1": 0.06388317953020159,
"f1_stderr": 0.0014760537495948263
},
"harness|gsm8k|5": {
"acc": 0.026535253980288095,
"acc_stderr": 0.004427045987265172
},
"harness|winogrande|5": {
"acc": 0.6898184688239937,
"acc_stderr": 0.013000454144859902
}
}
```
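The results files loaded above parse into plain nested dictionaries keyed by `"harness|<task>|<num_fewshot>"`. As a minimal sketch (the `results` dict below simply hard-codes the values from the block above rather than fetching the file), a per-task metric is a two-level lookup:

```python
# Aggregated metrics reproduced from the latest-results block above,
# hard-coded here for illustration instead of being loaded from the Hub.
results = {
    "all": {
        "em": 0.00388003355704698,
        "f1": 0.06388317953020159,
        "acc": 0.3581768614021409,
    },
    "harness|drop|3": {"em": 0.00388003355704698, "f1": 0.06388317953020159},
    "harness|gsm8k|5": {"acc": 0.026535253980288095},
    "harness|winogrande|5": {"acc": 0.6898184688239937},
}

# Task keys follow the "harness|<task>|<num_fewshot>" convention,
# so a per-task metric is a plain nested dictionary access.
winogrande_acc = results["harness|winogrande|5"]["acc"]
print(f"winogrande (5-shot) acc: {winogrande_acc:.4f}")  # → 0.6898
```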
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-4x7b-v5 | ---
pretty_name: Evaluation run of jsfs11/MixtureofMerges-MoE-4x7b-v5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jsfs11/MixtureofMerges-MoE-4x7b-v5](https://huggingface.co/jsfs11/MixtureofMerges-MoE-4x7b-v5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-4x7b-v5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T04:07:58.931568](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-4x7b-v5/blob/main/results_2024-03-01T04-07-58.931568.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.653432343566328,\n\
\ \"acc_stderr\": 0.03198522504293209,\n \"acc_norm\": 0.6525554225781037,\n\
\ \"acc_norm_stderr\": 0.03265700882712598,\n \"mc1\": 0.594859241126071,\n\
\ \"mc1_stderr\": 0.01718561172775338,\n \"mc2\": 0.737297987613419,\n\
\ \"mc2_stderr\": 0.014526936687711855\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7175767918088737,\n \"acc_stderr\": 0.013155456884097224,\n\
\ \"acc_norm\": 0.7389078498293515,\n \"acc_norm_stderr\": 0.01283552390947384\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7215694084843657,\n\
\ \"acc_stderr\": 0.004473104537026919,\n \"acc_norm\": 0.8899621589324835,\n\
\ \"acc_norm_stderr\": 0.003122973632039472\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834841,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834841\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n\
\ \"acc_stderr\": 0.01656897123354861,\n \"acc_norm\": 0.4324022346368715,\n\
\ \"acc_norm_stderr\": 0.01656897123354861\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n\
\ \"acc_stderr\": 0.012745204626083135,\n \"acc_norm\": 0.46870925684485004,\n\
\ \"acc_norm_stderr\": 0.012745204626083135\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.594859241126071,\n\
\ \"mc1_stderr\": 0.01718561172775338,\n \"mc2\": 0.737297987613419,\n\
\ \"mc2_stderr\": 0.014526936687711855\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.850828729281768,\n \"acc_stderr\": 0.010012598805627295\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6974981046247157,\n \
\ \"acc_stderr\": 0.012652544133186141\n }\n}\n```"
repo_url: https://huggingface.co/abhishek/hepu-o4zf-ravz-7-0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|arc:challenge|25_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|gsm8k|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hellaswag|10_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T04-07-58.931568.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T04-07-58.931568.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- '**/details_harness|winogrande|5_2024-03-01T04-07-58.931568.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T04-07-58.931568.parquet'
- config_name: results
data_files:
- split: 2024_03_01T04_07_58.931568
path:
- results_2024-03-01T04-07-58.931568.parquet
- split: latest
path:
- results_2024-03-01T04-07-58.931568.parquet
---
# Dataset Card for Evaluation run of jsfs11/MixtureofMerges-MoE-4x7b-v5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jsfs11/MixtureofMerges-MoE-4x7b-v5](https://huggingface.co/jsfs11/MixtureofMerges-MoE-4x7b-v5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-4x7b-v5",
"harness_winogrande_5",
split="train")
```
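The config names listed in the YAML above follow a predictable pattern derived from the harness task string. The sketch below infers that mapping from the config list itself; `task_to_config` is a hypothetical helper written for illustration, not part of the `datasets` API:

```python
def task_to_config(task: str) -> str:
    """Map a harness task string (e.g. "harness|hendrycksTest-virology|5")
    to the corresponding config name (e.g. "harness_hendrycksTest_virology_5").

    The rule is inferred from the config list above: "|", "-", and ":" in the
    task string all become "_" in the config name.
    """
    name, shots = task.rsplit("|", 1)
    name = name.replace("harness|", "harness_").replace("-", "_").replace(":", "_")
    return f"{name}_{shots}"

print(task_to_config("harness|hendrycksTest-virology|5"))
# → harness_hendrycksTest_virology_5
```

This can be handy when iterating over the per-task results programmatically instead of hard-coding each config name.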
## Latest results
These are the [latest results from run 2024-03-01T04:07:58.931568](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-4x7b-v5/blob/main/results_2024-03-01T04-07-58.931568.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.653432343566328,
"acc_stderr": 0.03198522504293209,
"acc_norm": 0.6525554225781037,
"acc_norm_stderr": 0.03265700882712598,
"mc1": 0.594859241126071,
"mc1_stderr": 0.01718561172775338,
"mc2": 0.737297987613419,
"mc2_stderr": 0.014526936687711855
},
"harness|arc:challenge|25": {
"acc": 0.7175767918088737,
"acc_stderr": 0.013155456884097224,
"acc_norm": 0.7389078498293515,
"acc_norm_stderr": 0.01283552390947384
},
"harness|hellaswag|10": {
"acc": 0.7215694084843657,
"acc_stderr": 0.004473104537026919,
"acc_norm": 0.8899621589324835,
"acc_norm_stderr": 0.003122973632039472
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834841,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834841
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.01656897123354861,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.01656897123354861
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083135,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083135
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.594859241126071,
"mc1_stderr": 0.01718561172775338,
"mc2": 0.737297987613419,
"mc2_stderr": 0.014526936687711855
},
"harness|winogrande|5": {
"acc": 0.850828729281768,
"acc_stderr": 0.010012598805627295
},
"harness|gsm8k|5": {
"acc": 0.6974981046247157,
"acc_stderr": 0.012652544133186141
}
}
```
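The per-task entries above can be post-processed directly once parsed as a dict. As a minimal sketch, the snippet below macro-averages a few `acc_norm` values copied verbatim from the JSON above (an illustration only, using a hand-picked subset rather than the full task list):

```python
from statistics import mean

# A small subset of the per-task results shown above; each value is the
# task's acc_norm copied from the JSON.
results = {
    "harness|hendrycksTest-abstract_algebra|5": 0.34,
    "harness|hendrycksTest-anatomy|5": 0.6666666666666666,
    "harness|hendrycksTest-astronomy|5": 0.6907894736842105,
    "harness|hendrycksTest-world_religions|5": 0.8304093567251462,
}

# Keep only the MMLU (hendrycksTest) tasks and macro-average them.
mmlu = {k: v for k, v in results.items() if "hendrycksTest" in k}
print(round(mean(mmlu.values()), 4))
```

Note that this unweighted macro-average over a subset will not match the aggregate `acc_norm` reported under `"all"`, which is computed over every task.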
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_wnli_who_as | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1153
num_examples: 5
- name: test
num_bytes: 4996
num_examples: 14
- name: train
num_bytes: 6964
num_examples: 24
download_size: 14662
dataset_size: 13113
---
# Dataset Card for "MULTI_VALUE_wnli_who_as"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
asrtre/qrtdfga | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Locutusque__Hercules-4.0-Mistral-v0.2-7B | ---
pretty_name: Evaluation run of Locutusque/Hercules-4.0-Mistral-v0.2-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/Hercules-4.0-Mistral-v0.2-7B](https://huggingface.co/Locutusque/Hercules-4.0-Mistral-v0.2-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__Hercules-4.0-Mistral-v0.2-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-03T02:39:45.725937](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Hercules-4.0-Mistral-v0.2-7B/blob/main/results_2024-04-03T02-39-45.725937.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6251062879958064,\n\
\ \"acc_stderr\": 0.0326560611851356,\n \"acc_norm\": 0.6293901017858107,\n\
\ \"acc_norm_stderr\": 0.03331443063112672,\n \"mc1\": 0.2729498164014688,\n\
\ \"mc1_stderr\": 0.01559475363200653,\n \"mc2\": 0.40987376850150087,\n\
\ \"mc2_stderr\": 0.014140110026214792\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5520477815699659,\n \"acc_stderr\": 0.014532011498211676,\n\
\ \"acc_norm\": 0.5895904436860068,\n \"acc_norm_stderr\": 0.014374922192642666\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6215893248356901,\n\
\ \"acc_stderr\": 0.004839995745602318,\n \"acc_norm\": 0.8260306711810397,\n\
\ \"acc_norm_stderr\": 0.0037830836739860627\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n\
\ \"acc_stderr\": 0.024362599693031086,\n \"acc_norm\": 0.7580645161290323,\n\
\ \"acc_norm_stderr\": 0.024362599693031086\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n\
\ \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431367,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431367\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.033851779760448106,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.033851779760448106\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7918263090676884,\n\
\ \"acc_stderr\": 0.014518592248904033,\n \"acc_norm\": 0.7918263090676884,\n\
\ \"acc_norm_stderr\": 0.014518592248904033\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.02475241196091721,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.02475241196091721\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2581005586592179,\n\
\ \"acc_stderr\": 0.014635185616527824,\n \"acc_norm\": 0.2581005586592179,\n\
\ \"acc_norm_stderr\": 0.014635185616527824\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958147,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958147\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.02584224870090217,\n\
\ \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.02584224870090217\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4485006518904824,\n\
\ \"acc_stderr\": 0.012702317490559802,\n \"acc_norm\": 0.4485006518904824,\n\
\ \"acc_norm_stderr\": 0.012702317490559802\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000318,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000318\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.02892058322067561,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.02892058322067561\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2729498164014688,\n\
\ \"mc1_stderr\": 0.01559475363200653,\n \"mc2\": 0.40987376850150087,\n\
\ \"mc2_stderr\": 0.014140110026214792\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.0115399127343454\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4541319181197877,\n \
\ \"acc_stderr\": 0.01371441094526456\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/Hercules-4.0-Mistral-v0.2-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|arc:challenge|25_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|gsm8k|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hellaswag|10_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T02-39-45.725937.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T02-39-45.725937.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- '**/details_harness|winogrande|5_2024-04-03T02-39-45.725937.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-03T02-39-45.725937.parquet'
- config_name: results
data_files:
- split: 2024_04_03T02_39_45.725937
path:
- results_2024-04-03T02-39-45.725937.parquet
- split: latest
path:
- results_2024-04-03T02-39-45.725937.parquet
---
# Dataset Card for Evaluation run of Locutusque/Hercules-4.0-Mistral-v0.2-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/Hercules-4.0-Mistral-v0.2-7B](https://huggingface.co/Locutusque/Hercules-4.0-Mistral-v0.2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__Hercules-4.0-Mistral-v0.2-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-03T02:39:45.725937](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Hercules-4.0-Mistral-v0.2-7B/blob/main/results_2024-04-03T02-39-45.725937.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6251062879958064,
"acc_stderr": 0.0326560611851356,
"acc_norm": 0.6293901017858107,
"acc_norm_stderr": 0.03331443063112672,
"mc1": 0.2729498164014688,
"mc1_stderr": 0.01559475363200653,
"mc2": 0.40987376850150087,
"mc2_stderr": 0.014140110026214792
},
"harness|arc:challenge|25": {
"acc": 0.5520477815699659,
"acc_stderr": 0.014532011498211676,
"acc_norm": 0.5895904436860068,
"acc_norm_stderr": 0.014374922192642666
},
"harness|hellaswag|10": {
"acc": 0.6215893248356901,
"acc_stderr": 0.004839995745602318,
"acc_norm": 0.8260306711810397,
"acc_norm_stderr": 0.0037830836739860627
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031086,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031086
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431367,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431367
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7918263090676884,
"acc_stderr": 0.014518592248904033,
"acc_norm": 0.7918263090676884,
"acc_norm_stderr": 0.014518592248904033
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.02475241196091721,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.02475241196091721
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2581005586592179,
"acc_stderr": 0.014635185616527824,
"acc_norm": 0.2581005586592179,
"acc_norm_stderr": 0.014635185616527824
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.02584224870090217,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.02584224870090217
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4485006518904824,
"acc_stderr": 0.012702317490559802,
"acc_norm": 0.4485006518904824,
"acc_norm_stderr": 0.012702317490559802
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000318,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000318
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.02892058322067561,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.02892058322067561
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2729498164014688,
"mc1_stderr": 0.01559475363200653,
"mc2": 0.40987376850150087,
"mc2_stderr": 0.014140110026214792
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.0115399127343454
},
"harness|gsm8k|5": {
"acc": 0.4541319181197877,
"acc_stderr": 0.01371441094526456
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dnlpy/Stable-Diffusion | ---
viewer: false
--- |
edbeeching/prj_gia_dataset_atari_2B_atari_gravitar_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning dataset for the atari_gravitar environment, sampled from the policy atari_2B_atari_gravitar_1111.
This dataset was created as part of the Generally Intelligent Agents project gia: https://github.com/huggingface/gia
|
open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2 | ---
pretty_name: Evaluation run of perlthoughts/Chupacabra-7B-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [perlthoughts/Chupacabra-7B-v2](https://huggingface.co/perlthoughts/Chupacabra-7B-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T18:02:58.053786](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2/blob/main/results_2023-12-04T18-02-58.053786.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6367599015709243,\n\
\ \"acc_stderr\": 0.03218025799515212,\n \"acc_norm\": 0.6396357428050704,\n\
\ \"acc_norm_stderr\": 0.0328187456646889,\n \"mc1\": 0.397796817625459,\n\
\ \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5717077514762566,\n\
\ \"mc2_stderr\": 0.0156197692783717\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.613481228668942,\n \"acc_stderr\": 0.014230084761910478,\n\
\ \"acc_norm\": 0.6518771331058021,\n \"acc_norm_stderr\": 0.013921008595179342\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6473809998008365,\n\
\ \"acc_stderr\": 0.004768088918512182,\n \"acc_norm\": 0.8338976299541924,\n\
\ \"acc_norm_stderr\": 0.003714118884317389\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n\
\ \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n\
\ \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n\
\ \"acc_stderr\": 0.03252909619613197,\n \"acc_norm\": 0.548936170212766,\n\
\ \"acc_norm_stderr\": 0.03252909619613197\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.047028804320496165,\n\
\ \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.047028804320496165\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\"\
: 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n\
\ \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n\
\ \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n\
\ \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \
\ \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"\
acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782658,\n\
\ \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782658\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n \
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099867,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099867\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n\
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153183,\n\
\ \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4033519553072626,\n\
\ \"acc_stderr\": 0.01640712303219525,\n \"acc_norm\": 0.4033519553072626,\n\
\ \"acc_norm_stderr\": 0.01640712303219525\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153273,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153273\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n\
\ \"acc_stderr\": 0.01272978538659856,\n \"acc_norm\": 0.4602346805736636,\n\
\ \"acc_norm_stderr\": 0.01272978538659856\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6519607843137255,\n \"acc_stderr\": 0.01927099870822398,\n \
\ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.01927099870822398\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174923,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174923\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.397796817625459,\n\
\ \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5717077514762566,\n\
\ \"mc2_stderr\": 0.0156197692783717\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.01161619821577323\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5473843821076573,\n \
\ \"acc_stderr\": 0.013710499070935132\n }\n}\n```"
repo_url: https://huggingface.co/abhishek/hepu-o4zf-ravz-7-0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|arc:challenge|25_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|arc:challenge|25_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|arc:challenge|25_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|drop|3_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|drop|3_2023-11-23T09-18-59.989572.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T09-18-59.989572.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|gsm8k|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|gsm8k|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_03T15_21_26.428024
path:
- '**/details_harness|gsm8k|5_2023-12-03T15-21-26.428024.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|gsm8k|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hellaswag|10_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hellaswag|10_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hellaswag|10_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T09-06-05.823190.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T09-18-59.989572.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T18-02-58.053786.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T18-02-58.053786.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- '**/details_harness|winogrande|5_2023-11-23T09-06-05.823190.parquet'
- split: 2023_11_23T09_18_59.989572
path:
- '**/details_harness|winogrande|5_2023-11-23T09-18-59.989572.parquet'
- split: 2023_12_04T18_02_58.053786
path:
- '**/details_harness|winogrande|5_2023-12-04T18-02-58.053786.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T18-02-58.053786.parquet'
- config_name: results
data_files:
- split: 2023_11_23T09_06_05.823190
path:
- results_2023-11-23T09-06-05.823190.parquet
- split: 2023_11_23T09_18_59.989572
path:
- results_2023-11-23T09-18-59.989572.parquet
- split: 2023_12_03T15_21_26.428024
path:
- results_2023-12-03T15-21-26.428024.parquet
- split: 2023_12_04T18_02_58.053786
path:
- results_2023-12-04T18-02-58.053786.parquet
- split: latest
path:
- results_2023-12-04T18-02-58.053786.parquet
---
# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/perlthoughts/Chupacabra-7B-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [perlthoughts/Chupacabra-7B-v2](https://huggingface.co/perlthoughts/Chupacabra-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
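As a concrete illustration of the naming convention (this helper is not part of the dataset tooling, just a sketch): a run's ISO timestamp such as `2023-12-04T18:02:58.053786` becomes the split name `2023_12_04T18_02_58.053786` by replacing the dashes in the date and the colons in the time with underscores, while the fractional seconds keep their dot.

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run's ISO timestamp to its split name in this dataset.

    Dashes in the date part and colons in the time part are replaced by
    underscores; everything else (including the fractional seconds) is kept.
    """
    date, time = ts.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(timestamp_to_split("2023-12-04T18:02:58.053786"))
# -> 2023_12_04T18_02_58.053786
```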
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T18:02:58.053786](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2/blob/main/results_2023-12-04T18-02-58.053786.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its "results" and "latest" splits for that eval):
```python
{
"all": {
"acc": 0.6367599015709243,
"acc_stderr": 0.03218025799515212,
"acc_norm": 0.6396357428050704,
"acc_norm_stderr": 0.0328187456646889,
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559635,
"mc2": 0.5717077514762566,
"mc2_stderr": 0.0156197692783717
},
"harness|arc:challenge|25": {
"acc": 0.613481228668942,
"acc_stderr": 0.014230084761910478,
"acc_norm": 0.6518771331058021,
"acc_norm_stderr": 0.013921008595179342
},
"harness|hellaswag|10": {
"acc": 0.6473809998008365,
"acc_stderr": 0.004768088918512182,
"acc_norm": 0.8338976299541924,
"acc_norm_stderr": 0.003714118884317389
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782658,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782658
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932022,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932022
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.03077805742293167,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.03077805742293167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099867,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099867
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266196,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266196
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.025070713719153183,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.025070713719153183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4033519553072626,
"acc_stderr": 0.01640712303219525,
"acc_norm": 0.4033519553072626,
"acc_norm_stderr": 0.01640712303219525
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279056,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279056
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153273,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153273
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.01272978538659856,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.01272978538659856
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.01927099870822398,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.01927099870822398
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174923,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174923
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559635,
"mc2": 0.5717077514762566,
"mc2_stderr": 0.0156197692783717
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.01161619821577323
},
"harness|gsm8k|5": {
"acc": 0.5473843821076573,
"acc_stderr": 0.013710499070935132
}
}
```
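As a sketch of how the results above can be post-processed, the snippet below computes the mean `acc_norm` over the MMLU (`hendrycksTest`) subtasks. The inline `results` dict reuses a small subset of the scores shown; in practice you would `json.load()` the full results file from the repo.

```python
# Hedged sketch: aggregate per-task metrics from a results dict shaped like
# the JSON above (task name -> {"acc": ..., "acc_norm": ...}).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.25},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.6842105263157895},
}

# Keep only the MMLU (hendrycksTest) subtasks and average their acc_norm.
mmlu = {k: v for k, v in results.items() if "hendrycksTest" in k}
mmlu_avg = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"MMLU subtasks: {len(mmlu)}, mean acc_norm: {mmlu_avg:.4f}")
```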
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
realshyfox/LazyLlama2 | ---
license: llama2
---
|
medric49/test | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: category
dtype: string
- name: response_1
dtype: string
- name: response_2
dtype: string
- name: response_3
dtype: string
splits:
- name: train
num_bytes: 24320
num_examples: 10
download_size: 29608
dataset_size: 24320
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tianzhou/auditor_sentiment | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
- sentiment-classification
paperswithcode_id: null
pretty_name: Auditor_Sentiment
---
# Dataset Card for Auditor Sentiment
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
## Dataset Description
Auditor review sentiment collected by News Department
- **Point of Contact:**
COE for Auditing, currently sue@demo.org
### Dataset Summary
Auditor sentiment dataset of sentences from financial news. The dataset consists of several thousand sentences from English language financial news categorized by sentiment.
### Supported Tasks and Leaderboards
Sentiment Classification
### Languages
English
## Dataset Structure
### Data Instances
```
"sentence": "Pharmaceuticals group Orion Corp reported a fall in its third-quarter earnings that were hit by larger expenditures on R&D and marketing .",
"label": "negative"
```
### Data Fields
- sentence: a tokenized line from the dataset
- label: a label corresponding to the class as a string: 'positive' - (2), 'neutral' - (1), or 'negative' - (0)
### Data Splits
A train/test split was created randomly with a 75/25 ratio.
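The label scheme described under "Data Fields" can be sketched as a simple mapping. This is an assumption about the `ClassLabel` ordering ('negative' = 0, 'neutral' = 1, 'positive' = 2) implied by the field description above, not a guaranteed part of the dataset's schema.

```python
# Assumed label ordering from the card: 'negative' -> 0, 'neutral' -> 1,
# 'positive' -> 2. Verify against the dataset's ClassLabel feature before use.
LABELS = ["negative", "neutral", "positive"]

def label_to_id(name: str) -> int:
    """Map a sentiment string to its integer class id."""
    return LABELS.index(name)

def id_to_label(idx: int) -> str:
    """Map an integer class id back to its sentiment string."""
    return LABELS[idx]

print(label_to_id("negative"), id_to_label(2))
```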
## Dataset Creation
### Curation Rationale
To gather our auditor evaluations into one dataset. Previous attempts using off-the-shelf sentiment models reached only 70% F1; this dataset was an attempt to improve upon that performance.
### Source Data
#### Initial Data Collection and Normalization
The corpus consists of English-language financial news reports.
#### Who are the source language producers?
The source data was written by various auditors.
### Annotations
#### Annotation process
This release of the auditor reviews covers a collection of 4840 sentences. The selected phrases were annotated by 16 people with adequate background knowledge of financial markets. The subset included here is the one where inter-annotator agreement was greater than 75%.
#### Who are the annotators?
They were pulled from the SME list; names are held by sue@demo.org
### Personal and Sensitive Information
There is no personal or sensitive information in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
All annotators were from the same institution, so inter-annotator agreement should be understood with this taken into account.
### Licensing Information
License: Demo.Org Proprietary - DO NOT SHARE
This dataset is based on the [financial phrasebank](https://huggingface.co/datasets/financial_phrasebank) dataset. |