| datasetId | card |
|---|---|
bigheiniuJ/JimmyLuAugConsistent | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: int64
- name: task
dtype: string
- name: input
dtype: string
- name: aug_type
dtype: string
- name: aug_time
dtype: int64
- name: output
dtype: string
- name: options
sequence: string
- name: seed
dtype: string
- name: split
dtype: string
splits:
- name: train
num_bytes: 2719565
num_examples: 9450
download_size: 1104965
dataset_size: 2719565
---
# Dataset Card for "JimmyLuAugConsistent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Circularmachines/batch_indexing_machine_720f_768px | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 159145222.0
num_examples: 720
download_size: 159156289
dataset_size: 159145222.0
---
# Dataset Card for "batch_indexing_machine_720f_768px"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Alizee/wikiner_fr_mixed_caps | ---
language:
- fr
size_categories:
- 100K<n<1M
task_categories:
- token-classification
pretty_name: wikiner_fr
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': LOC
'2': PER
'3': MISC
'4': ORG
splits:
- name: train
num_bytes: 54139057
num_examples: 120060
- name: test
num_bytes: 5952227
num_examples: 13393
download_size: 15572314
dataset_size: 60091284
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "wikiner_fr_mixed_caps"
This is an updated version of the dataset [Jean-Baptiste/wikiner_fr](https://huggingface.co/datasets/Jean-Baptiste/wikiner_fr), with:
- removal of duplicated examples and leakage
- random de-capitalization of words (20%)
The code used to create these changes is available in the script `update_dataset.py` in the repository.
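For illustration, here is a minimal sketch of the de-capitalization step (this is not the actual `update_dataset.py`; it assumes pre-tokenized examples and applies a hypothetical 20% probability per capitalized token):
```python
import random

def random_decapitalize(tokens, prob=0.2, seed=0):
    """Lower-case the first letter of roughly `prob` of the capitalized tokens."""
    rng = random.Random(seed)
    return [
        tok[0].lower() + tok[1:]
        if tok[:1].isupper() and rng.random() < prob
        else tok
        for tok in tokens
    ]

print(random_decapitalize(["Emmanuel", "Macron", "visite", "Paris"]))
```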
Dataset Description (reproduced from the original repo):
- **Homepage:** https://metatext.io/datasets/wikiner
- **Repository:**
- **Paper:** https://www.sciencedirect.com/science/article/pii/S0004370212000276?via%3Dihub
- **Leaderboard:**
- **Point of Contact:** |
fotiecodes/jarvis-llama2-dataset | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 39366
num_examples: 229
download_size: 16940
dataset_size: 39366
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
santoshtyss/uk_courts_cases | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1734427314
num_examples: 39040
- name: validation
num_bytes: 211421379
num_examples: 4000
download_size: 983466250
dataset_size: 1945848693
---
# Dataset Card for "uk_courts_cases"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_adamo1139__Yi-6B-200K-AEZAKMI-v2 | ---
pretty_name: Evaluation run of adamo1139/Yi-6B-200K-AEZAKMI-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [adamo1139/Yi-6B-200K-AEZAKMI-v2](https://huggingface.co/adamo1139/Yi-6B-200K-AEZAKMI-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adamo1139__Yi-6B-200K-AEZAKMI-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T22:39:37.508676](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-6B-200K-AEZAKMI-v2/blob/main/results_2024-01-10T22-39-37.508676.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6216609966600087,\n\
\ \"acc_stderr\": 0.032603529357893186,\n \"acc_norm\": 0.6297228151355619,\n\
\ \"acc_norm_stderr\": 0.03328101108137097,\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.0164667696136983,\n \"mc2\": 0.4679227286826816,\n\
\ \"mc2_stderr\": 0.01563467369999731\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.507679180887372,\n \"acc_stderr\": 0.01460966744089257,\n\
\ \"acc_norm\": 0.5298634812286689,\n \"acc_norm_stderr\": 0.014585305840007107\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5461063533160725,\n\
\ \"acc_stderr\": 0.0049685216080654635,\n \"acc_norm\": 0.7120095598486357,\n\
\ \"acc_norm_stderr\": 0.00451901168841718\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952929,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952929\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.040329990539607195,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.040329990539607195\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726367,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726367\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.032081157507886836,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.032081157507886836\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.039792366374974096,\n\
\ \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.039792366374974096\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4523809523809524,\n \"acc_stderr\": 0.025634258115554955,\n \"\
acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.025634258115554955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n\
\ \"acc_stderr\": 0.02468597928623996,\n \"acc_norm\": 0.7483870967741936,\n\
\ \"acc_norm_stderr\": 0.02468597928623996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153327,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153327\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.0287420409039485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.0287420409039485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.01619780795684805,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.01619780795684805\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n\
\ \"acc_stderr\": 0.014711684386139963,\n \"acc_norm\": 0.7841634738186463,\n\
\ \"acc_norm_stderr\": 0.014711684386139963\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4145251396648045,\n\
\ \"acc_stderr\": 0.016476342210254,\n \"acc_norm\": 0.4145251396648045,\n\
\ \"acc_norm_stderr\": 0.016476342210254\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137904,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137904\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153262,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153262\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630453,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630453\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\
\ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\
\ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223974,\n \
\ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223974\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.038743715565879536,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.038743715565879536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.0164667696136983,\n \"mc2\": 0.4679227286826816,\n\
\ \"mc2_stderr\": 0.01563467369999731\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7048145224940805,\n \"acc_stderr\": 0.012819410741754775\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.25094768764215314,\n \
\ \"acc_stderr\": 0.011942354768308834\n }\n}\n```"
repo_url: https://huggingface.co/adamo1139/Yi-6B-200K-AEZAKMI-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|arc:challenge|25_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|gsm8k|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hellaswag|10_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T22-39-37.508676.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T22-39-37.508676.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- '**/details_harness|winogrande|5_2024-01-10T22-39-37.508676.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T22-39-37.508676.parquet'
- config_name: results
data_files:
- split: 2024_01_10T22_39_37.508676
path:
- results_2024-01-10T22-39-37.508676.parquet
- split: latest
path:
- results_2024-01-10T22-39-37.508676.parquet
---
# Dataset Card for Evaluation run of adamo1139/Yi-6B-200K-AEZAKMI-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [adamo1139/Yi-6B-200K-AEZAKMI-v2](https://huggingface.co/adamo1139/Yi-6B-200K-AEZAKMI-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adamo1139__Yi-6B-200K-AEZAKMI-v2",
"harness_winogrande_5",
split="train")
```
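The aggregated metrics live in the `results` configuration; as a sketch (same API, just the `results` config name and the `latest` split declared in the configuration list above):
```python
from datasets import load_dataset

# The "latest" split of every configuration points at the most recent run,
# so the aggregated metrics can be fetched without knowing the timestamp.
results = load_dataset(
    "open-llm-leaderboard/details_adamo1139__Yi-6B-200K-AEZAKMI-v2",
    "results",
    split="latest",
)
print(results[0])
```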
## Latest results
These are the [latest results from run 2024-01-10T22:39:37.508676](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-6B-200K-AEZAKMI-v2/blob/main/results_2024-01-10T22-39-37.508676.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6216609966600087,
"acc_stderr": 0.032603529357893186,
"acc_norm": 0.6297228151355619,
"acc_norm_stderr": 0.03328101108137097,
"mc1": 0.33047735618115054,
"mc1_stderr": 0.0164667696136983,
"mc2": 0.4679227286826816,
"mc2_stderr": 0.01563467369999731
},
"harness|arc:challenge|25": {
"acc": 0.507679180887372,
"acc_stderr": 0.01460966744089257,
"acc_norm": 0.5298634812286689,
"acc_norm_stderr": 0.014585305840007107
},
"harness|hellaswag|10": {
"acc": 0.5461063533160725,
"acc_stderr": 0.0049685216080654635,
"acc_norm": 0.7120095598486357,
"acc_norm_stderr": 0.00451901168841718
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952929,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952929
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.040329990539607195,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.040329990539607195
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726367,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726367
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.032081157507886836,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.032081157507886836
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.039792366374974096,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.039792366374974096
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.025634258115554955,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.025634258115554955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.02468597928623996,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.02468597928623996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153327,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153327
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0287420409039485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0287420409039485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7521008403361344,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.7521008403361344,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.01619780795684805,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.01619780795684805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.014711684386139963,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.014711684386139963
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4145251396648045,
"acc_stderr": 0.016476342210254,
"acc_norm": 0.4145251396648045,
"acc_norm_stderr": 0.016476342210254
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137904,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137904
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153262,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153262
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.025773111169630453,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.025773111169630453
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223974,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223974
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33047735618115054,
"mc1_stderr": 0.0164667696136983,
"mc2": 0.4679227286826816,
"mc2_stderr": 0.01563467369999731
},
"harness|winogrande|5": {
"acc": 0.7048145224940805,
"acc_stderr": 0.012819410741754775
},
"harness|gsm8k|5": {
"acc": 0.25094768764215314,
"acc_stderr": 0.011942354768308834
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
maxolotl/must-c-en-es-wait3-01 | ---
dataset_info:
features:
- name: current_source
dtype: string
- name: current_target
dtype: string
- name: target_token
dtype: string
splits:
- name: train
num_bytes: 995393073
num_examples: 5241096
- name: test
num_bytes: 9963278
num_examples: 57200
- name: validation
num_bytes: 5434544
num_examples: 27561
download_size: 184391223
dataset_size: 1010790895
---
# Dataset Card for "must-c-en-es-wait3-01"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carolaisabel/mini-croupier | ---
license: apache-2.0
---
|
davanstrien/testtesttest | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: annotation_id
dtype: int64
- name: annotator
dtype: int64
- name: choice
dtype:
class_label:
names:
'0': Adult content
'1': Weapons
- name: created_at
dtype: string
- name: id
dtype: int64
- name: image
dtype: image
- name: lead_time
dtype: float64
- name: updated_at
dtype: string
splits:
- name: train
num_bytes: 602063
num_examples: 4
download_size: 607006
dataset_size: 602063
language:
- en
- yue
- ar
---
# Dataset Card for "testtesttest"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Doowon96/News_Val_202401 | ---
dataset_info:
features:
- name: 제목
dtype: string
- name: 키워드
dtype: string
- name: 특성추출(가중치순 상위 50개)
dtype: string
- name: 본문
dtype: string
- name: 카테고리
dtype: string
splits:
- name: val
num_bytes: 45329684
num_examples: 20000
download_size: 26283618
dataset_size: 45329684
configs:
- config_name: default
data_files:
- split: val
path: data/val-*
---
|
bigbio/chemdner |
---
language:
- en
bigbio_language:
- English
license: unknown
multilinguality: monolingual
bigbio_license_shortname: UNKNOWN
pretty_name: CHEMDNER
homepage: https://biocreative.bioinformatics.udel.edu/resources/biocreative-iv/chemdner-corpus/
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- NAMED_ENTITY_RECOGNITION
- TEXT_CLASSIFICATION
---
# Dataset Card for CHEMDNER
## Dataset Description
- **Homepage:** https://biocreative.bioinformatics.udel.edu/resources/biocreative-iv/chemdner-corpus/
- **Pubmed:** True
- **Public:** True
- **Tasks:** NER,TXTCLASS
We present the CHEMDNER corpus, a collection of 10,000 PubMed abstracts that
contain a total of 84,355 chemical entity mentions labeled manually by expert
chemistry literature curators, following annotation guidelines specifically
defined for this task. The abstracts of the CHEMDNER corpus were selected to be
representative for all major chemical disciplines. Each of the chemical entity
mentions was manually labeled according to its structure-associated chemical
entity mention (SACEM) class: abbreviation, family, formula, identifier,
multiple, systematic and trivial.
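For readers who want to inspect the SACEM annotations directly, the snippet below is a minimal sketch of loading the corpus with the `datasets` library. The config name `chemdner_bigbio_kb` and the entity field names follow the usual BigBio KB-schema conventions and are assumptions, not taken from this card.

```python
from datasets import load_dataset

# Minimal sketch: the config name "chemdner_bigbio_kb" and the entity
# field names below follow the usual BigBio KB-schema conventions and
# are assumptions, not taken from this card.
chemdner = load_dataset("bigbio/chemdner", name="chemdner_bigbio_kb", split="train")

doc = chemdner[0]
for entity in doc["entities"]:
    # Each mention carries its SACEM class in "type", plus text and offsets.
    print(entity["type"], entity["text"], entity["offsets"])
```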
## Citation Information
```
@article{Krallinger2015,
title = {The CHEMDNER corpus of chemicals and drugs and its annotation principles},
author = {
Krallinger, Martin and Rabal, Obdulia and Leitner, Florian and Vazquez,
Miguel and Salgado, David and Lu, Zhiyong and Leaman, Robert and Lu, Yanan
and Ji, Donghong and Lowe, Daniel M. and Sayle, Roger A. and
Batista-Navarro, Riza Theresa and Rak, Rafal and Huber, Torsten and
Rockt{"a}schel, Tim and Matos, S{'e}rgio and Campos, David and Tang,
Buzhou and Xu, Hua and Munkhdalai, Tsendsuren and Ryu, Keun Ho and Ramanan,
S. V. and Nathan, Senthil and {{Z}}itnik, Slavko and Bajec, Marko and
Weber, Lutz and Irmer, Matthias and Akhondi, Saber A. and Kors, Jan A. and
Xu, Shuo and An, Xin and Sikdar, Utpal Kumar and Ekbal, Asif and Yoshioka,
Masaharu and Dieb, Thaer M. and Choi, Miji and Verspoor, Karin and Khabsa,
Madian and Giles, C. Lee and Liu, Hongfang and Ravikumar, Komandur
Elayavilli and Lamurias, Andre and Couto, Francisco M. and Dai, Hong-Jie
    and Tsai, Richard Tzong-Han and Ata, Caglar and Can, Tolga and Usi{\'e},
    Anabel and Alves, Rui and Segura-Bedmar, Isabel and Mart{\'i}nez, Paloma
and Oyarzabal, Julen and Valencia, Alfonso
},
year = 2015,
month = {Jan},
day = 19,
journal = {Journal of Cheminformatics},
volume = 7,
number = 1,
pages = {S2},
doi = {10.1186/1758-2946-7-S1-S2},
issn = {1758-2946},
url = {https://doi.org/10.1186/1758-2946-7-S1-S2},
abstract = {
The automatic extraction of chemical information from text requires the
recognition of chemical entity mentions as one of its key steps. When
developing supervised named entity recognition (NER) systems, the
availability of a large, manually annotated text corpus is desirable.
Furthermore, large corpora permit the robust evaluation and comparison of
different approaches that detect chemicals in documents. We present the
CHEMDNER corpus, a collection of 10,000 PubMed abstracts that contain a
total of 84,355 chemical entity mentions labeled manually by expert
chemistry literature curators, following annotation guidelines specifically
defined for this task. The abstracts of the CHEMDNER corpus were selected
to be representative for all major chemical disciplines. Each of the
chemical entity mentions was manually labeled according to its
structure-associated chemical entity mention (SACEM) class: abbreviation,
family, formula, identifier, multiple, systematic and trivial. The
difficulty and consistency of tagging chemicals in text was measured using
an agreement study between annotators, obtaining a percentage agreement of
91. For a subset of the CHEMDNER corpus (the test set of 3,000 abstracts)
we provide not only the Gold Standard manual annotations, but also mentions
automatically detected by the 26 teams that participated in the BioCreative
IV CHEMDNER chemical mention recognition task. In addition, we release the
CHEMDNER silver standard corpus of automatically extracted mentions from
17,000 randomly selected PubMed abstracts. A version of the CHEMDNER corpus
in the BioC format has been generated as well. We propose a standard for
required minimum information about entity annotations for the construction
of domain specific corpora on chemical and drug entities. The CHEMDNER
corpus and annotation guidelines are available at:
    http://www.biocreative.org/resources/biocreative-iv/chemdner-corpus/
}
}
```
|
mosh2i/mimi_tokenizer | ---
task_categories:
- text-generation
language:
- bn
- en
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
adirik/fashion_image_caption-100 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 22842342.0
num_examples: 100
download_size: 22823708
dataset_size: 22842342.0
---
# Dataset Card for "fashion_image_caption-100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LightFury9/transliteration-telugu-words | ---
dataset_info:
features:
- name: unique_identifier
dtype: string
- name: native word
dtype: string
- name: english word
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 190122662
num_examples: 2429562
- name: test
num_bytes: 661473
num_examples: 10260
- name: validation
num_bytes: 507490
num_examples: 7681
download_size: 91663334
dataset_size: 191291625
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
youyu0105/llm-MIDI2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 52189587
num_examples: 23112
download_size: 12023169
dataset_size: 52189587
---
# Dataset Card for "llm-MIDI2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KETI-AIR/kor_commonsense_qa | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: data_index_by_user
dtype: int32
- name: id
dtype: string
- name: question
dtype: string
- name: question_concept
dtype: string
- name: choices
struct:
- name: text
sequence: string
- name: label
sequence: string
- name: answerKey
dtype: string
splits:
- name: train
num_bytes: 2642161
num_examples: 9741
- name: validation
num_bytes: 327694
num_examples: 1221
- name: test
num_bytes: 309213
num_examples: 1140
download_size: 1782280
dataset_size: 3279068
license: mit
---
# Dataset Card for "kor_commonsense_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
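As a quick orientation to the schema above, the sketch below shows how the nested `choices` struct and `answerKey` fit together; it relies only on the features listed in the YAML header.

```python
from datasets import load_dataset

# Sketch: "choices" holds parallel "label"/"text" sequences and
# "answerKey" stores the gold label (e.g. "A"), so resolving the answer
# text is a single index lookup. Only the features listed above are used.
ds = load_dataset("KETI-AIR/kor_commonsense_qa", split="validation")

ex = ds[0]
labels = ex["choices"]["label"]
texts = ex["choices"]["text"]
print(ex["question"], "->", texts[labels.index(ex["answerKey"])])
```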
# Source Data Citation Information
```
@inproceedings{talmor-etal-2019-commonsenseqa,
title = "{C}ommonsense{QA}: A Question Answering Challenge Targeting Commonsense Knowledge",
author = "Talmor, Alon and
Herzig, Jonathan and
Lourie, Nicholas and
Berant, Jonathan",
booktitle = "Proceedings of the 2019 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)",
month = jun,
year = "2019",
address = "Minneapolis, Minnesota",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/N19-1421",
doi = "10.18653/v1/N19-1421",
pages = "4149--4158",
archivePrefix = "arXiv",
eprint = "1811.00937",
primaryClass = "cs",
}
``` |
tyzhu/wiki_find_passage_train200_eval40_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 317804
num_examples: 440
- name: validation
num_bytes: 33460
num_examples: 40
download_size: 146364
dataset_size: 351264
---
# Dataset Card for "wiki_find_passage_train200_eval40_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bartoszmaj/nouns_one | ---
license: openrail
dataset_info:
features:
- name: nouns
sequence: string
splits:
- name: train
num_bytes: 239483054
num_examples: 1000000
download_size: 70614383
dataset_size: 239483054
---
|
freshpearYoon/v3_train_free_concat_18 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 3842473160
num_examples: 2500
download_size: 1787779457
dataset_size: 3842473160
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
rajendrabaskota/imagenet-adm-dataset | ---
dataset_info:
features:
- name: file_path
dtype: string
- name: label
dtype: int64
- name: img_embed
sequence: float64
splits:
- name: train
num_bytes: 495065600.0
num_examples: 80000
download_size: 403110039
dataset_size: 495065600.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-elementary_mathematics-rule-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 148516
num_examples: 378
download_size: 82529
dataset_size: 148516
---
# Dataset Card for "mmlu-elementary_mathematics-rule-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jiwon65/aihub_child-10k_general-6k_feature-extracted_for_test | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: audio
sequence: float32
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: test
num_bytes: 18250978390
num_examples: 16000
download_size: 4220035968
dataset_size: 18250978390
---
# Dataset Card for "aihub_child-10k_general-6k_feature-extracted_for_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-sociology-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 65025
num_examples: 201
download_size: 43079
dataset_size: 65025
---
# Dataset Card for "mmlu-sociology-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_rte_double_comparative | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 50269
num_examples: 110
- name: train
num_bytes: 50296
num_examples: 104
download_size: 73644
dataset_size: 100565
---
# Dataset Card for "MULTI_VALUE_rte_double_comparative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dampish/StellarX-4b-SWE2 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 5480326.0
num_examples: 1000
download_size: 675495
dataset_size: 5480326.0
---
# Dataset Card for "StellarX-4b-SWE2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_huggingtweets__gladosystem | ---
pretty_name: Evaluation run of huggingtweets/gladosystem
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [huggingtweets/gladosystem](https://huggingface.co/huggingtweets/gladosystem)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huggingtweets__gladosystem\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-13T03:18:40.922910](https://huggingface.co/datasets/open-llm-leaderboard/details_huggingtweets__gladosystem/blob/main/results_2023-10-13T03-18-40.922910.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.010276845637583893,\n\
\ \"em_stderr\": 0.0010328242665282317,\n \"f1\": 0.014896182885906039,\n\
\ \"f1_stderr\": 0.0011273085873104653,\n \"acc\": 0.2533543804262036,\n\
\ \"acc_stderr\": 0.0070256103461651745\n },\n \"harness|drop|3\":\
\ {\n \"em\": 0.010276845637583893,\n \"em_stderr\": 0.0010328242665282317,\n\
\ \"f1\": 0.014896182885906039,\n \"f1_stderr\": 0.0011273085873104653\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5067087608524072,\n\
\ \"acc_stderr\": 0.014051220692330349\n }\n}\n```"
repo_url: https://huggingface.co/huggingtweets/gladosystem
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T03_18_40.922910
path:
- '**/details_harness|drop|3_2023-10-13T03-18-40.922910.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-13T03-18-40.922910.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T03_18_40.922910
path:
- '**/details_harness|gsm8k|5_2023-10-13T03-18-40.922910.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-13T03-18-40.922910.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T03_18_40.922910
path:
- '**/details_harness|winogrande|5_2023-10-13T03-18-40.922910.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-13T03-18-40.922910.parquet'
- config_name: results
data_files:
- split: 2023_10_13T03_18_40.922910
path:
- results_2023-10-13T03-18-40.922910.parquet
- split: latest
path:
- results_2023-10-13T03-18-40.922910.parquet
---
# Dataset Card for Evaluation run of huggingtweets/gladosystem
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/huggingtweets/gladosystem
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [huggingtweets/gladosystem](https://huggingface.co/huggingtweets/gladosystem) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huggingtweets__gladosystem",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-13T03:18:40.922910](https://huggingface.co/datasets/open-llm-leaderboard/details_huggingtweets__gladosystem/blob/main/results_2023-10-13T03-18-40.922910.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.010276845637583893,
"em_stderr": 0.0010328242665282317,
"f1": 0.014896182885906039,
"f1_stderr": 0.0011273085873104653,
"acc": 0.2533543804262036,
"acc_stderr": 0.0070256103461651745
},
"harness|drop|3": {
"em": 0.010276845637583893,
"em_stderr": 0.0010328242665282317,
"f1": 0.014896182885906039,
"f1_stderr": 0.0011273085873104653
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5067087608524072,
"acc_stderr": 0.014051220692330349
}
}
```
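The aggregated numbers above also live in the `results` configuration described earlier; below is a minimal sketch of reading them programmatically (the `latest` split follows the convention listed in the YAML header).

```python
from datasets import load_dataset

# Sketch: the "results" config keeps one row per evaluation run, and the
# "latest" split always points at the most recent one (see the configs
# listed in the YAML header above).
results = load_dataset(
    "open-llm-leaderboard/details_huggingtweets__gladosystem",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics such as winogrande acc and drop em/f1
```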
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mwz/ur | ---
license: mit
---
|
distilled-from-one-sec-cv12/chunk_102 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1318349216
num_examples: 256888
download_size: 1347663135
dataset_size: 1318349216
---
# Dataset Card for "chunk_102"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Sayan01__Llama-Flan-XL2base | ---
pretty_name: Evaluation run of Sayan01/Llama-Flan-XL2base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sayan01/Llama-Flan-XL2base](https://huggingface.co/Sayan01/Llama-Flan-XL2base)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sayan01__Llama-Flan-XL2base_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-25T01:29:13.925640](https://huggingface.co/datasets/open-llm-leaderboard/details_Sayan01__Llama-Flan-XL2base_public/blob/main/results_2023-11-25T01-29-13.925640.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23221079815429288,\n\
\ \"acc_stderr\": 0.02994811714846116,\n \"acc_norm\": 0.23187497505966656,\n\
\ \"acc_norm_stderr\": 0.030736580620987688,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.01500067437357034,\n \"mc2\": 0.5058224656335896,\n\
\ \"mc2_stderr\": 0.016425425630600676,\n \"em\": 0.00010486577181208053,\n\
\ \"em_stderr\": 0.00010486577181208623,\n \"f1\": 0.0029037332214765076,\n\
\ \"f1_stderr\": 0.0002952362942135874\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.1757679180887372,\n \"acc_stderr\": 0.01112285086312048,\n\
\ \"acc_norm\": 0.20648464163822525,\n \"acc_norm_stderr\": 0.011828865619002316\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2592113124875523,\n\
\ \"acc_stderr\": 0.004373062283376514,\n \"acc_norm\": 0.2533359888468433,\n\
\ \"acc_norm_stderr\": 0.0043403282041351975\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2503259452411995,\n\
\ \"acc_stderr\": 0.011064151027165443,\n \"acc_norm\": 0.2503259452411995,\n\
\ \"acc_norm_stderr\": 0.011064151027165443\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2423500611995104,\n \"mc1_stderr\": 0.01500067437357034,\n\
\ \"mc2\": 0.5058224656335896,\n \"mc2_stderr\": 0.016425425630600676\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.5090765588003157,\n\
\ \"acc_stderr\": 0.014050170094497704\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.00010486577181208053,\n \"em_stderr\": 0.00010486577181208623,\n\
\ \"f1\": 0.0029037332214765076,\n \"f1_stderr\": 0.0002952362942135874\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Sayan01/Llama-Flan-XL2base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|arc:challenge|25_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|drop|3_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|gsm8k|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hellaswag|10_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|winogrande|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-25T01-29-13.925640.parquet'
- config_name: results
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- results_2023-11-25T01-29-13.925640.parquet
- split: latest
path:
- results_2023-11-25T01-29-13.925640.parquet
---
# Dataset Card for Evaluation run of Sayan01/Llama-Flan-XL2base
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sayan01/Llama-Flan-XL2base
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sayan01/Llama-Flan-XL2base](https://huggingface.co/Sayan01/Llama-Flan-XL2base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sayan01__Llama-Flan-XL2base_public",
"harness_winogrande_5",
split="train")
```
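For the aggregated metrics mentioned above, a similar call against the "results" configuration should work; this is a sketch based on the config and split names listed in the YAML header of this card:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics for the run;
# the "latest" split points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Sayan01__Llama-Flan-XL2base_public",
    "results",
    split="latest",
)
print(results[0])
```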
## Latest results
These are the [latest results from run 2023-11-25T01:29:13.925640](https://huggingface.co/datasets/open-llm-leaderboard/details_Sayan01__Llama-Flan-XL2base_public/blob/main/results_2023-11-25T01-29-13.925640.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23221079815429288,
"acc_stderr": 0.02994811714846116,
"acc_norm": 0.23187497505966656,
"acc_norm_stderr": 0.030736580620987688,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.01500067437357034,
"mc2": 0.5058224656335896,
"mc2_stderr": 0.016425425630600676,
"em": 0.00010486577181208053,
"em_stderr": 0.00010486577181208623,
"f1": 0.0029037332214765076,
"f1_stderr": 0.0002952362942135874
},
"harness|arc:challenge|25": {
"acc": 0.1757679180887372,
"acc_stderr": 0.01112285086312048,
"acc_norm": 0.20648464163822525,
"acc_norm_stderr": 0.011828865619002316
},
"harness|hellaswag|10": {
"acc": 0.2592113124875523,
"acc_stderr": 0.004373062283376514,
"acc_norm": 0.2533359888468433,
"acc_norm_stderr": 0.0043403282041351975
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2503259452411995,
"acc_stderr": 0.011064151027165443,
"acc_norm": 0.2503259452411995,
"acc_norm_stderr": 0.011064151027165443
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.01500067437357034,
"mc2": 0.5058224656335896,
"mc2_stderr": 0.016425425630600676
},
"harness|winogrande|5": {
"acc": 0.5090765588003157,
"acc_stderr": 0.014050170094497704
},
"harness|drop|3": {
"em": 0.00010486577181208053,
"em_stderr": 0.00010486577181208623,
"f1": 0.0029037332214765076,
"f1_stderr": 0.0002952362942135874
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CreativeLang/moh_metaphor | ---
license: cc-by-2.0
language:
- en
pretty_name: moh
size_categories:
- 1K<n<10K
---
# MOH Dataset
Creative Language Toolkit (CLTK) Metadata
- CL Type: Metaphor
- Task Type: detection, interpretation
- Size: 1k~2k
- Created time: 2016
**Description**:
The MOH dataset is a dataset for metaphor processing, released in this [paper](https://aclanthology.org/S16-2003.pdf).
For more details, please check the original paper.
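As a minimal sketch, the data can be pulled from the Hub with the `datasets` library (the split and field names are assumptions, since this card does not document them):
```python
from datasets import load_dataset

# Load the metaphor dataset from the Hugging Face Hub.
# Assumption: the data is exposed under a default "train" split.
moh = load_dataset("CreativeLang/moh_metaphor", split="train")

# Inspect one example; the actual field names depend on the release.
print(moh[0])
```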
## Citation
If you use this dataset, please cite:
```
@inproceedings{Mohammad2016MetaphorAA,
title={Metaphor as a Medium for Emotion: An Empirical Study},
author={Saif M. Mohammad and Ekaterina Shutova and Peter D. Turney},
booktitle={International Workshop on Semantic Evaluation},
year={2016}
}
```
## Contact
If you have any further queries, please open an issue or reach out by [mail](mailto:yucheng.li@surrey.ac.uk). |
felipesampaio2010/alfredoadame | ---
license: openrail
---
|
qwedsacf/grade-school-math-instructions | ---
dataset_info:
features:
- name: INSTRUCTION
dtype: string
- name: RESPONSE
dtype: string
- name: SOURCE
dtype: string
splits:
- name: train
num_bytes: 4804916
num_examples: 8792
download_size: 2554896
dataset_size: 4804916
---
# Dataset Card for grade-school-math-instructions
OpenAI's [grade-school-math](https://github.com/openai/grade-school-math) dataset converted into instructions.
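A minimal loading sketch, using the split and column names from the `dataset_info` section above:
```python
from datasets import load_dataset

# Load the instruction-formatted GSM8K problems (single "train" split per the card metadata).
gsm_instructions = load_dataset("qwedsacf/grade-school-math-instructions", split="train")

# Each row pairs an INSTRUCTION (the word problem) with its worked RESPONSE and a SOURCE tag.
example = gsm_instructions[0]
print(example["INSTRUCTION"])
print(example["RESPONSE"])
```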
## Citation Information
```bibtex
@article{cobbe2021gsm8k,
title={Training Verifiers to Solve Math Word Problems},
author={Cobbe, Karl and Kosaraju, Vineet and Bavarian, Mohammad and Chen, Mark and Jun, Heewoo and Kaiser, Lukasz and Plappert, Matthias and Tworek, Jerry and Hilton, Jacob and Nakano, Reiichiro and Hesse, Christopher and Schulman, John},
journal={arXiv preprint arXiv:2110.14168},
year={2021}
}
``` |
AdapterOcean/med_alpaca_standardized_cluster_27 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 62650843
num_examples: 6864
download_size: 16758796
dataset_size: 62650843
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_27"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-squad_v2-e06b4410-11855584 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: deepset/tinybert-6l-768d-squad2
metrics: []
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: deepset/tinybert-6l-768d-squad2
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
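A hedged sketch of how the stored predictions might be inspected with the `datasets` library (the split name is an assumption; the exact file layout of AutoTrain prediction repos is not documented here):
```python
from datasets import load_dataset

# Load the AutoTrain-generated predictions for deepset/tinybert-6l-768d-squad2 on squad_v2.
# Assumption: the predictions are exposed under a single default "train" split.
predictions = load_dataset(
    "autoevaluate/autoeval-staging-eval-project-squad_v2-e06b4410-11855584",
    split="train",
)
print(predictions)
```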
## Contributions
Thanks to [@sjrlee](https://huggingface.co/sjrlee) for evaluating this model. |
open-llm-leaderboard/details_yunconglong__7Bx4_DPO_700 | ---
pretty_name: Evaluation run of yunconglong/7Bx4_DPO_700
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yunconglong/7Bx4_DPO_700](https://huggingface.co/yunconglong/7Bx4_DPO_700) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yunconglong__7Bx4_DPO_700\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-20T13:27:07.293796](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__7Bx4_DPO_700/blob/main/results_2024-01-20T13-27-07.293796.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6264963262252387,\n\
\ \"acc_stderr\": 0.032586235357354824,\n \"acc_norm\": 0.6267812687226082,\n\
\ \"acc_norm_stderr\": 0.03325199889771283,\n \"mc1\": 0.5042839657282742,\n\
\ \"mc1_stderr\": 0.017502858577371272,\n \"mc2\": 0.6898757991807779,\n\
\ \"mc2_stderr\": 0.015194284964225467\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6331058020477816,\n \"acc_stderr\": 0.014084133118104296,\n\
\ \"acc_norm\": 0.6467576791808873,\n \"acc_norm_stderr\": 0.013967822714840053\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6800438159729137,\n\
\ \"acc_stderr\": 0.00465505930860262,\n \"acc_norm\": 0.8611830312686716,\n\
\ \"acc_norm_stderr\": 0.003450488042964998\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n\
\ \"acc_stderr\": 0.024472243840895528,\n \"acc_norm\": 0.7548387096774194,\n\
\ \"acc_norm_stderr\": 0.024472243840895528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723872,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723872\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.024162780284017724,\n\
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.024162780284017724\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871937,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871937\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848043,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848043\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294406999,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294406999\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n\
\ \"acc_stderr\": 0.01660256461504994,\n \"acc_norm\": 0.4402234636871508,\n\
\ \"acc_norm_stderr\": 0.01660256461504994\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766002,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766002\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n\
\ \"acc_stderr\": 0.01272570165695364,\n \"acc_norm\": 0.45827900912646674,\n\
\ \"acc_norm_stderr\": 0.01272570165695364\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6388888888888888,\n \"acc_stderr\": 0.01943177567703731,\n \
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.01943177567703731\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5042839657282742,\n\
\ \"mc1_stderr\": 0.017502858577371272,\n \"mc2\": 0.6898757991807779,\n\
\ \"mc2_stderr\": 0.015194284964225467\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7971586424625099,\n \"acc_stderr\": 0.011301439925936654\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6338134950720242,\n \
\ \"acc_stderr\": 0.013270100238748835\n }\n}\n```"
repo_url: https://huggingface.co/yunconglong/7Bx4_DPO_700
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|arc:challenge|25_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|gsm8k|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hellaswag|10_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T13-27-07.293796.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T13-27-07.293796.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- '**/details_harness|winogrande|5_2024-01-20T13-27-07.293796.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-20T13-27-07.293796.parquet'
- config_name: results
data_files:
- split: 2024_01_20T13_27_07.293796
path:
- results_2024-01-20T13-27-07.293796.parquet
- split: latest
path:
- results_2024-01-20T13-27-07.293796.parquet
---
# Dataset Card for Evaluation run of yunconglong/7Bx4_DPO_700
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yunconglong/7Bx4_DPO_700](https://huggingface.co/yunconglong/7Bx4_DPO_700) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yunconglong__7Bx4_DPO_700",
"harness_winogrande_5",
split="train")
```
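The same pattern works for any of the configurations listed in the YAML header above, which each define a timestamped split and a `latest` alias. A minimal sketch (config and split names are taken from that list; the variable names are only illustrative):
```python
from datasets import load_dataset

repo_id = "open-llm-leaderboard/details_yunconglong__7Bx4_DPO_700"

# Aggregated metrics for the run (the "results" configuration described above)
results = load_dataset(repo_id, "results", split="latest")

# Per-example details for one task, using the "latest" alias
gsm8k_details = load_dataset(repo_id, "harness_gsm8k_5", split="latest")
```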
## Latest results
These are the [latest results from run 2024-01-20T13:27:07.293796](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__7Bx4_DPO_700/blob/main/results_2024-01-20T13-27-07.293796.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6264963262252387,
"acc_stderr": 0.032586235357354824,
"acc_norm": 0.6267812687226082,
"acc_norm_stderr": 0.03325199889771283,
"mc1": 0.5042839657282742,
"mc1_stderr": 0.017502858577371272,
"mc2": 0.6898757991807779,
"mc2_stderr": 0.015194284964225467
},
"harness|arc:challenge|25": {
"acc": 0.6331058020477816,
"acc_stderr": 0.014084133118104296,
"acc_norm": 0.6467576791808873,
"acc_norm_stderr": 0.013967822714840053
},
"harness|hellaswag|10": {
"acc": 0.6800438159729137,
"acc_stderr": 0.00465505930860262,
"acc_norm": 0.8611830312686716,
"acc_norm_stderr": 0.003450488042964998
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895528,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723872,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723872
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.024162780284017724,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.024162780284017724
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871937,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871937
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848043,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848043
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294406999,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294406999
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.01660256461504994,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.01660256461504994
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766002,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766002
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45827900912646674,
"acc_stderr": 0.01272570165695364,
"acc_norm": 0.45827900912646674,
"acc_norm_stderr": 0.01272570165695364
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.01943177567703731,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.01943177567703731
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5042839657282742,
"mc1_stderr": 0.017502858577371272,
"mc2": 0.6898757991807779,
"mc2_stderr": 0.015194284964225467
},
"harness|winogrande|5": {
"acc": 0.7971586424625099,
"acc_stderr": 0.011301439925936654
},
"harness|gsm8k|5": {
"acc": 0.6338134950720242,
"acc_stderr": 0.013270100238748835
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lansinuote/cv.3.image_object_detection | ---
dataset_info:
features:
- name: image
dtype: image
- name: digits
sequence:
- name: bbox
sequence: int32
length: 4
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
'7': '7'
'8': '8'
'9': '9'
splits:
- name: train
num_bytes: 67463846.91850163
num_examples: 6646
- name: test
num_bytes: 690276.6069133481
num_examples: 68
download_size: 60342937
dataset_size: 68154123.52541497
---
# Dataset Card for "cv.3.image_object_detection"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
haml/1 | ---
license: mit
---
|
achinthani/test-2 | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for test-2
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("achinthani/test-2")
```
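If you want to work with the records on a running Argilla server rather than only locally, the loaded dataset can be pushed to it. A minimal sketch, assuming an Argilla 1.x server; the URL, API key, and workspace below are placeholders:
```python
import argilla as rg

# Connect to your own Argilla instance (placeholder URL and API key)
rg.init(api_url="http://localhost:6900", api_key="owner.apikey")

# Load the dataset from the Hub and push it to a workspace on the server
ds = rg.FeedbackDataset.from_huggingface("achinthani/test-2")
remote_ds = ds.push_to_argilla(name="test-2", workspace="admin")
```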
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("achinthani/test-2")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves; for the moment, just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| text | Text | text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| sentiment | Sentiment | label_selection | True | N/A | ['positive', 'neutral', 'negative'] |
| mixed-emotion | Mixed-emotion | multi_label_selection | True | N/A | ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'] |
| ranking | Ranking | ranking | True | N/A | ['1', '2', '3', '4', '5'] |
| rating | Rating | rating | True | N/A | [1, 2, 3, 4, 5] |
| text-annotation | Feedback | text | True | N/A | N/A |
The **suggestions** are human- or machine-generated recommendations for each question, meant to assist the annotator during the annotation process. They are always linked to the existing questions and are named by appending "-suggestion" and "-suggestion-metadata" to the question name, containing the value/s of the suggestion and its metadata, respectively. The possible values are the same as in the table above; the suggestion column name is suffixed with "-suggestion" and its metadata with "-suggestion-metadata".
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to give extra context to the annotators or to document the record itself, for example a link to its original source, the author, the date, or the provenance. The metadata is always optional, and can potentially be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines** are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find them in the [annotation guidelines](#annotation-guidelines) section.
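For reference, the fields, questions, and guidelines described above map onto the following schema definition. This is only a sketch using the Argilla 1.x `FeedbackDataset` API; the exact constructor arguments may differ slightly between Argilla versions:
```python
import argilla as rg

# Schema mirroring the fields and questions tables above
dataset = rg.FeedbackDataset(
    guidelines=(
        "Emotion is a dataset of English Twitter messages with six basic emotions: "
        "anger, fear, joy, love, sadness, and surprise."
    ),
    fields=[
        rg.TextField(name="text", title="Text", required=True, use_markdown=False),
    ],
    questions=[
        rg.LabelQuestion(name="sentiment", title="Sentiment", labels=["positive", "neutral", "negative"], required=True),
        rg.MultiLabelQuestion(name="mixed-emotion", title="Mixed-emotion", labels=["joy", "anger", "sadness", "fear", "surprise", "love"], required=True),
        rg.RankingQuestion(name="ranking", title="Ranking", values=["1", "2", "3", "4", "5"], required=True),
        rg.RatingQuestion(name="rating", title="Rating", values=[1, 2, 3, 4, 5], required=True),
        rg.TextQuestion(name="text-annotation", title="Feedback", required=True),
    ],
)
```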
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"text": "Absolutely infuriated by the lack of accountability in our government. It\u0027s time for real change!"
},
"metadata": {},
"responses": [],
"suggestions": [],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"metadata": "{}",
"mixed-emotion": [],
"mixed-emotion-suggestion": null,
"mixed-emotion-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"ranking": [],
"ranking-suggestion": null,
"ranking-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"rating": [],
"rating-suggestion": null,
"rating-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"sentiment": [],
"sentiment-suggestion": null,
"sentiment-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"text": "Absolutely infuriated by the lack of accountability in our government. It\u0027s time for real change!",
"text-annotation": [],
"text-annotation-suggestion": null,
"text-annotation-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
}
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; for the moment, just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **text** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **sentiment** is of type `label_selection` with the following allowed values ['positive', 'neutral', 'negative'].
* **mixed-emotion** is of type `multi_label_selection` with the following allowed values ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'].
* **ranking** is of type `ranking` with the following allowed values ['1', '2', '3', '4', '5'].
* **rating** is of type `rating` with the following allowed values [1, 2, 3, 4, 5].
* **text-annotation** is of type `text`.
* **Suggestions:** As of Argilla 1.13.0, suggestions have been included to ease or assist the annotators during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **sentiment-suggestion** is of type `label_selection` with the following allowed values ['positive', 'neutral', 'negative'].
* (optional) **mixed-emotion-suggestion** is of type `multi_label_selection` with the following allowed values ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'].
* (optional) **ranking-suggestion** is of type `ranking` with the following allowed values ['1', '2', '3', '4', '5'].
* (optional) **rating-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5].
* (optional) **text-annotation-suggestion** is of type `text`.
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to give extra context to the annotators or to document the record itself, for example a link to its original source, the author, the date, or the provenance. The metadata is always optional, and can potentially be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
Emotion is a dataset of English Twitter messages with six basic emotions: anger, fear, joy, love, sadness, and surprise.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
autoevaluate/autoeval-staging-eval-big_patent-y-7d0862-15806177 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- big_patent
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-base-16384-booksum-V11-big_patent-V2
metrics: []
dataset_name: big_patent
dataset_config: y
dataset_split: test
col_mapping:
text: description
target: abstract
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-base-16384-booksum-V11-big_patent-V2
* Dataset: big_patent
* Config: y
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
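The stored predictions can be inspected like any other Hub dataset. A minimal sketch; the available config and split names inside this repository are not documented here, so they are queried first rather than assumed:
```python
from datasets import get_dataset_config_names, get_dataset_split_names, load_dataset

repo_id = "autoevaluate/autoeval-staging-eval-big_patent-y-7d0862-15806177"

# Inspect what the prediction repository actually contains before loading
configs = get_dataset_config_names(repo_id)
splits = get_dataset_split_names(repo_id, configs[0])
print(configs, splits)

# Load the first available config/split with the generated predictions
preds = load_dataset(repo_id, configs[0], split=splits[0])
print(preds)
```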
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
bhavyagiri/semantic-memes | ---
license: apache-2.0
---
|
ImperialIndians23/nlp_cw_training_data_8375 | ---
dataset_info:
features:
- name: par_id
dtype: string
- name: community
dtype: string
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 2520387
num_examples: 8375
download_size: 1589528
dataset_size: 2520387
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_chlee10__T3Q-Platypus-MistralM7-7B | ---
pretty_name: Evaluation run of chlee10/T3Q-Platypus-MistralM7-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chlee10/T3Q-Platypus-MistralM7-7B](https://huggingface.co/chlee10/T3Q-Platypus-MistralM7-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chlee10__T3Q-Platypus-MistralM7-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-12T17:18:40.071702](https://huggingface.co/datasets/open-llm-leaderboard/details_chlee10__T3Q-Platypus-MistralM7-7B/blob/main/results_2024-03-12T17-18-40.071702.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6163449592811839,\n\
\ \"acc_stderr\": 0.03299423833276716,\n \"acc_norm\": 0.6174371002530614,\n\
\ \"acc_norm_stderr\": 0.033672955197017344,\n \"mc1\": 0.42472460220318237,\n\
\ \"mc1_stderr\": 0.017304000957167477,\n \"mc2\": 0.5998754966374884,\n\
\ \"mc2_stderr\": 0.0151025785664364\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5955631399317406,\n \"acc_stderr\": 0.014342036483436179,\n\
\ \"acc_norm\": 0.6416382252559727,\n \"acc_norm_stderr\": 0.014012883334859855\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.652459669388568,\n\
\ \"acc_stderr\": 0.004752158936871871,\n \"acc_norm\": 0.8516231826329417,\n\
\ \"acc_norm_stderr\": 0.003547466310325385\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6528301886792452,\n \"acc_stderr\": 0.029300101705549652,\n\
\ \"acc_norm\": 0.6528301886792452,\n \"acc_norm_stderr\": 0.029300101705549652\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n\
\ \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n\
\ \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.4808510638297872,\n \"acc_stderr\": 0.03266204299064678,\n \"\
acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.03266204299064678\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n\
\ \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n\
\ \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.0245375915728305,\n \
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.0245375915728305\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.01659525971039931,\n \"\
acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.01659525971039931\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.03512385283705048,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.03512385283705048\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.019875655027867447,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.019875655027867447\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294406999,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294406999\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.02557412378654667,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.02557412378654667\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35307262569832404,\n\
\ \"acc_stderr\": 0.015984204545268568,\n \"acc_norm\": 0.35307262569832404,\n\
\ \"acc_norm_stderr\": 0.015984204545268568\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.026568921015457155,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.026568921015457155\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.02532988817190092,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.02532988817190092\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.438722294654498,\n\
\ \"acc_stderr\": 0.012673969883493274,\n \"acc_norm\": 0.438722294654498,\n\
\ \"acc_norm_stderr\": 0.012673969883493274\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.029722152099280065,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.029722152099280065\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6143790849673203,\n \"acc_stderr\": 0.01969145905235403,\n \
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.01969145905235403\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982066,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982066\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\
\ \"acc_stderr\": 0.03251006816458617,\n \"acc_norm\": 0.6965174129353234,\n\
\ \"acc_norm_stderr\": 0.03251006816458617\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.038743715565879536,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.038743715565879536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42472460220318237,\n\
\ \"mc1_stderr\": 0.017304000957167477,\n \"mc2\": 0.5998754966374884,\n\
\ \"mc2_stderr\": 0.0151025785664364\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.01090597811215688\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5981804397270659,\n \
\ \"acc_stderr\": 0.013504357787494037\n }\n}\n```"
repo_url: https://huggingface.co/chlee10/T3Q-Platypus-MistralM7-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|arc:challenge|25_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|gsm8k|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hellaswag|10_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T17-18-40.071702.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-12T17-18-40.071702.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- '**/details_harness|winogrande|5_2024-03-12T17-18-40.071702.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-12T17-18-40.071702.parquet'
- config_name: results
data_files:
- split: 2024_03_12T17_18_40.071702
path:
- results_2024-03-12T17-18-40.071702.parquet
- split: latest
path:
- results_2024-03-12T17-18-40.071702.parquet
---
# Dataset Card for Evaluation run of chlee10/T3Q-Platypus-MistralM7-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chlee10/T3Q-Platypus-MistralM7-7B](https://huggingface.co/chlee10/T3Q-Platypus-MistralM7-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
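A minimal sketch of loading that aggregated "results" configuration (assuming the same `datasets` loading pattern used for the per-task details below):
```python
from datasets import load_dataset

# "results" and "latest" follow the configuration/split naming described above.
results = load_dataset(
    "open-llm-leaderboard/details_chlee10__T3Q-Platypus-MistralM7-7B",
    "results",
    split="latest",
)
```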
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chlee10__T3Q-Platypus-MistralM7-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-12T17:18:40.071702](https://huggingface.co/datasets/open-llm-leaderboard/details_chlee10__T3Q-Platypus-MistralM7-7B/blob/main/results_2024-03-12T17-18-40.071702.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6163449592811839,
"acc_stderr": 0.03299423833276716,
"acc_norm": 0.6174371002530614,
"acc_norm_stderr": 0.033672955197017344,
"mc1": 0.42472460220318237,
"mc1_stderr": 0.017304000957167477,
"mc2": 0.5998754966374884,
"mc2_stderr": 0.0151025785664364
},
"harness|arc:challenge|25": {
"acc": 0.5955631399317406,
"acc_stderr": 0.014342036483436179,
"acc_norm": 0.6416382252559727,
"acc_norm_stderr": 0.014012883334859855
},
"harness|hellaswag|10": {
"acc": 0.652459669388568,
"acc_stderr": 0.004752158936871871,
"acc_norm": 0.8516231826329417,
"acc_norm_stderr": 0.003547466310325385
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6528301886792452,
"acc_stderr": 0.029300101705549652,
"acc_norm": 0.6528301886792452,
"acc_norm_stderr": 0.029300101705549652
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.0245375915728305,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.0245375915728305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413926,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.01659525971039931,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.01659525971039931
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.03512385283705048,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.03512385283705048
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867447,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867447
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294406999,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294406999
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.02557412378654667,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.02557412378654667
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35307262569832404,
"acc_stderr": 0.015984204545268568,
"acc_norm": 0.35307262569832404,
"acc_norm_stderr": 0.015984204545268568
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.026568921015457155,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.026568921015457155
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.438722294654498,
"acc_stderr": 0.012673969883493274,
"acc_norm": 0.438722294654498,
"acc_norm_stderr": 0.012673969883493274
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.029722152099280065,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.029722152099280065
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.01969145905235403,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.01969145905235403
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982066,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982066
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6965174129353234,
"acc_stderr": 0.03251006816458617,
"acc_norm": 0.6965174129353234,
"acc_norm_stderr": 0.03251006816458617
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42472460220318237,
"mc1_stderr": 0.017304000957167477,
"mc2": 0.5998754966374884,
"mc2_stderr": 0.0151025785664364
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.01090597811215688
},
"harness|gsm8k|5": {
"acc": 0.5981804397270659,
"acc_stderr": 0.013504357787494037
}
}
```
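For a quick aggregate view of the per-task MMLU (`hendrycksTest`) scores above, the per-task entries can be averaged with plain Python. A minimal sketch, assuming a locally downloaded `results_<timestamp>.json` file from this repository laid out like the snippet above (the local filename below is hypothetical):
```python
import json
# Hypothetical local path -- replace with the actual results_<timestamp>.json
# file downloaded from this repository.
with open("results_latest.json") as f:
    data = json.load(f)
# The per-task metrics may sit at the top level (as printed above) or under a
# "results" key, depending on the file; handle both.
tasks = data.get("results", data)
mmlu_accs = [
    v["acc"] for k, v in tasks.items() if k.startswith("harness|hendrycksTest-")
]
print(f"MMLU subtasks: {len(mmlu_accs)}, average acc: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```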
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ashutosh09/octopus | ---
license: mit
---
|
CyberHarem/vincennes_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of vincennes/ヴィンセンス/文森斯 (Azur Lane)
This is the dataset of vincennes/ヴィンセンス/文森斯 (Azur Lane), containing 17 images and their tags.
The core tags of this character are `blue_eyes, long_hair, blue_hair, bangs, twintails, hair_ornament, breasts, sidelocks, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 11.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vincennes_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 9.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vincennes_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 28 | 15.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vincennes_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 11.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vincennes_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 28 | 18.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vincennes_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/vincennes_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
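If you only need one of the pre-packed IMG+TXT archives listed above rather than the raw waifuc data, the same `hf_hub_download` approach works. The sketch below fetches the 800px package and pairs each image with its tag file; it assumes the usual IMG+TXT convention of one same-named `.txt` tag file per image.
```python
import os
import zipfile
from pathlib import Path
from huggingface_hub import hf_hub_download
# download the pre-packed 800px IMG+TXT archive
zip_file = hf_hub_download(
    repo_id='CyberHarem/vincennes_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract files to a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# pair every image with its same-named .txt tag file (assumed convention)
exts = {'.png', '.jpg', '.jpeg', '.webp'}
for img_path in sorted(p for p in Path(dataset_dir).iterdir() if p.suffix.lower() in exts):
    tag_file = img_path.with_suffix('.txt')
    tags = tag_file.read_text(encoding='utf-8').strip() if tag_file.exists() else ''
    print(img_path.name, '->', tags)
```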
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | blush, 1girl, looking_at_viewer, solo, white_background, full_body, long_sleeves, skirt, black_jacket, black_thighhighs, simple_background, closed_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | 1girl | looking_at_viewer | solo | white_background | full_body | long_sleeves | skirt | black_jacket | black_thighhighs | simple_background | closed_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:-------------------|:------------|:---------------|:--------|:---------------|:-------------------|:--------------------|:---------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
|
duraad/nep-spell-2k | ---
license: mit
---
|
NTA-Dev/BlackfootTraining | ---
license: apache-2.0
---
|
cmu-lti/agents_vs_script | ---
license: other
license_name: ai2-impact-license
license_link: https://allenai.org/licenses/impact-lr
---
|
Nexdata/10464_Videos_Calling_Behavior_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
10,464 Videos - Calling Behavior Data. The data includes indoor and outdoor scenes, and covers multiple scenes, multiple shooting angles and multiple resolutions. The data can be used for tasks such as calling behavior detection and calling behavior recognition.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1256?source=Huggingface
## Data size
10,464 videos
## Collecting environment
including indoor and outdoor scenes
## Data diversity
multiple scenes, multiple shooting angles, multiple resolutions
## Device
including surveillance camera, cellphone
## Collecting angle
looking down angle, eye-level angle
## Collecting time
day, night
## Weather distribution
sunny, cloudy
## Data format
the video data format is .mp4
## Accuracy
Based on the videos, the accuracy of data collection is more than 97%; the accuracy of label naming for videos and folders is more than 97%
# Licensing Information
Commercial License
|
open-llm-leaderboard/details_gaodrew__gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps | ---
pretty_name: Evaluation run of gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps](https://huggingface.co/gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gaodrew__gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-01T22:04:43.928284](https://huggingface.co/datasets/open-llm-leaderboard/details_gaodrew__gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps/blob/main/results_2023-10-01T22-04-43.928284.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0036703020134228187,\n\
\ \"em_stderr\": 0.0006192871806511078,\n \"f1\": 0.07745071308724842,\n\
\ \"f1_stderr\": 0.0016031429592015417,\n \"acc\": 0.4924286713583812,\n\
\ \"acc_stderr\": 0.01078503608525705\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0036703020134228187,\n \"em_stderr\": 0.0006192871806511078,\n\
\ \"f1\": 0.07745071308724842,\n \"f1_stderr\": 0.0016031429592015417\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17664897649734648,\n \
\ \"acc_stderr\": 0.01050486250585457\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n\
\ }\n}\n```"
repo_url: https://huggingface.co/gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|arc:challenge|25_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_01T22_04_43.928284
path:
- '**/details_harness|drop|3_2023-10-01T22-04-43.928284.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-01T22-04-43.928284.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_01T22_04_43.928284
path:
- '**/details_harness|gsm8k|5_2023-10-01T22-04-43.928284.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-01T22-04-43.928284.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hellaswag|10_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T01:39:27.936729.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_19T01_39_27.936729
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-19T01:39:27.936729.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-19T01:39:27.936729.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_01T22_04_43.928284
path:
- '**/details_harness|winogrande|5_2023-10-01T22-04-43.928284.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-01T22-04-43.928284.parquet'
- config_name: results
data_files:
- split: 2023_10_01T22_04_43.928284
path:
- results_2023-10-01T22-04-43.928284.parquet
- split: latest
path:
- results_2023-10-01T22-04-43.928284.parquet
---
# Dataset Card for Evaluation run of gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps](https://huggingface.co/gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gaodrew__gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps",
"harness_winogrande_5",
split="train")
```
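The aggregated metrics mentioned above are available through the `results` configuration in the same way. A minimal sketch (the exact column layout of this configuration is not documented here, so inspect it after loading):
```python
from datasets import load_dataset
results = load_dataset(
    "open-llm-leaderboard/details_gaodrew__gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps",
    "results",
    split="latest",
)
print(results)     # inspect the available columns
print(results[0])  # aggregated metrics of the most recent run
```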
## Latest results
These are the [latest results from run 2023-10-01T22:04:43.928284](https://huggingface.co/datasets/open-llm-leaderboard/details_gaodrew__gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps/blob/main/results_2023-10-01T22-04-43.928284.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0036703020134228187,
"em_stderr": 0.0006192871806511078,
"f1": 0.07745071308724842,
"f1_stderr": 0.0016031429592015417,
"acc": 0.4924286713583812,
"acc_stderr": 0.01078503608525705
},
"harness|drop|3": {
"em": 0.0036703020134228187,
"em_stderr": 0.0006192871806511078,
"f1": 0.07745071308724842,
"f1_stderr": 0.0016031429592015417
},
"harness|gsm8k|5": {
"acc": 0.17664897649734648,
"acc_stderr": 0.01050486250585457
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_meta-llama__Llama-2-13b-hf | ---
pretty_name: Evaluation run of meta-llama/Llama-2-13b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [meta-llama/Llama-2-13b-hf](https://huggingface.co/meta-llama/Llama-2-13b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 123 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 8 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split is always pointing to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_meta-llama__Llama-2-13b-hf\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T13:11:49.394544](https://huggingface.co/datasets/open-llm-leaderboard/details_meta-llama__Llama-2-13b-hf/blob/main/results_2023-12-02T13-11-49.394544.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.22820318423047764,\n\
\ \"acc_stderr\": 0.011559914877317397\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.22820318423047764,\n \"acc_stderr\": 0.011559914877317397\n\
\ }\n}\n```"
repo_url: https://huggingface.co/meta-llama/Llama-2-13b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|arc:challenge|25_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|arc:challenge|25_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|arc:challenge|25_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_drop_0
data_files:
- split: 2023_09_15T14_07_08.353318
path:
- '**/details_harness|drop|0_2023-09-15T14-07-08.353318.parquet'
- split: latest
path:
- '**/details_harness|drop|0_2023-09-15T14-07-08.353318.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_08T14_32_14.957248
path:
- '**/details_harness|drop|3_2023-09-08T14-32-14.957248.parquet'
- split: 2023_10_14T23_00_26.644553
path:
- '**/details_harness|drop|3_2023-10-14T23-00-26.644553.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-14T23-00-26.644553.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_08T14_32_14.957248
path:
- '**/details_harness|gsm8k|5_2023-09-08T14-32-14.957248.parquet'
- split: 2023_10_14T23_00_26.644553
path:
- '**/details_harness|gsm8k|5_2023-10-14T23-00-26.644553.parquet'
- split: 2023_12_02T13_11_49.394544
path:
- '**/details_harness|gsm8k|5_2023-12-02T13-11-49.394544.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T13-11-49.394544.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hellaswag|10_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hellaswag|10_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hellaswag|10_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T22:35:38.117975.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T17:28:00.015478.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:26:02.660247.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-19T22:35:38.117975.parquet'
- split: 2023_08_23T17_28_00.015478
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T17:28:00.015478.parquet'
- split: 2023_08_29T22_26_02.660247
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T22:26:02.660247.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T22:26:02.660247.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_08T14_32_14.957248
path:
- '**/details_harness|winogrande|5_2023-09-08T14-32-14.957248.parquet'
- split: 2023_10_14T23_00_26.644553
path:
- '**/details_harness|winogrande|5_2023-10-14T23-00-26.644553.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-14T23-00-26.644553.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T19:56:56.621542.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:management|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:virology|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T19:56:56.621542.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T19_56_56.621542
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:56:56.621542.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:56:56.621542.parquet'
- config_name: results
data_files:
- split: 2023_08_19T22_35_38.117975
path:
- results_2023-08-19T22:35:38.117975.parquet
- split: 2023_08_23T17_28_00.015478
path:
- results_2023-08-23T17:28:00.015478.parquet
- split: 2023_08_28T19_56_56.621542
path:
- results_2023-08-28T19:56:56.621542.parquet
- split: 2023_08_29T22_26_02.660247
path:
- results_2023-08-29T22:26:02.660247.parquet
- split: 2023_09_08T14_32_14.957248
path:
- results_2023-09-08T14-32-14.957248.parquet
- split: 2023_09_15T14_07_08.353318
path:
- results_2023-09-15T14-07-08.353318.parquet
- split: 2023_10_14T23_00_26.644553
path:
- results_2023-10-14T23-00-26.644553.parquet
- split: 2023_12_02T13_11_49.394544
path:
- results_2023-12-02T13-11-49.394544.parquet
- split: latest
path:
- results_2023-12-02T13-11-49.394544.parquet
---
# Dataset Card for Evaluation run of meta-llama/Llama-2-13b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/meta-llama/Llama-2-13b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [meta-llama/Llama-2-13b-hf](https://huggingface.co/meta-llama/Llama-2-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 123 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 8 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_meta-llama__Llama-2-13b-hf",
"harness_gsm8k_5",
split="train")
```
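The aggregated metrics mentioned above are stored in the "results" configuration; as a minimal sketch (assuming the `datasets` library and the `latest` split naming shown in the configs of this repository), they can be inspected with:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; its "latest" split points
# to the most recent evaluation run listed in the configs above.
results = load_dataset("open-llm-leaderboard/details_meta-llama__Llama-2-13b-hf",
	"results",
	split="latest")
print(results[0])
```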
## Latest results
These are the [latest results from run 2023-12-02T13:11:49.394544](https://huggingface.co/datasets/open-llm-leaderboard/details_meta-llama__Llama-2-13b-hf/blob/main/results_2023-12-02T13-11-49.394544.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.22820318423047764,
"acc_stderr": 0.011559914877317397
},
"harness|gsm8k|5": {
"acc": 0.22820318423047764,
"acc_stderr": 0.011559914877317397
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tpremoli/CelebA-attrs-80k | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: 5_o_Clock_Shadow
dtype: int64
- name: Arched_Eyebrows
dtype: int64
- name: Attractive
dtype: int64
- name: Bags_Under_Eyes
dtype: int64
- name: Bald
dtype: int64
- name: Bangs
dtype: int64
- name: Big_Lips
dtype: int64
- name: Big_Nose
dtype: int64
- name: Black_Hair
dtype: int64
- name: Blond_Hair
dtype: int64
- name: Blurry
dtype: int64
- name: Brown_Hair
dtype: int64
- name: Bushy_Eyebrows
dtype: int64
- name: Chubby
dtype: int64
- name: Double_Chin
dtype: int64
- name: Eyeglasses
dtype: int64
- name: Goatee
dtype: int64
- name: Gray_Hair
dtype: int64
- name: Heavy_Makeup
dtype: int64
- name: High_Cheekbones
dtype: int64
- name: Male
dtype: int64
- name: Mouth_Slightly_Open
dtype: int64
- name: Mustache
dtype: int64
- name: Narrow_Eyes
dtype: int64
- name: No_Beard
dtype: int64
- name: Oval_Face
dtype: int64
- name: Pale_Skin
dtype: int64
- name: Pointy_Nose
dtype: int64
- name: Receding_Hairline
dtype: int64
- name: Rosy_Cheeks
dtype: int64
- name: Sideburns
dtype: int64
- name: Smiling
dtype: int64
- name: Straight_Hair
dtype: int64
- name: Wavy_Hair
dtype: int64
- name: Wearing_Earrings
dtype: int64
- name: Wearing_Hat
dtype: int64
- name: Wearing_Lipstick
dtype: int64
- name: Wearing_Necklace
dtype: int64
- name: Wearing_Necktie
dtype: int64
- name: Young
dtype: int64
- name: prompt_string
dtype: string
splits:
- name: train
num_bytes: 595884212.447
num_examples: 79999
- name: validation
num_bytes: 73107405.93
num_examples: 9810
- name: test
num_bytes: 73120666.79
num_examples: 9763
download_size: 700256101
dataset_size: 742112285.167
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# CelebA-128x128
CelebA with attrs at 128x128 resolution.
## Dataset Information
The attributes are binary. The dataset is already split into train/test/validation sets.
This dataset has been reduced so that there are 80k train samples.
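For reference, a minimal loading sketch (assuming the Hugging Face `datasets` library; the feature names come from the metadata above):
```python
from datasets import load_dataset

# Each example pairs a 128x128 face image with 40 binary attribute
# columns (e.g. "Smiling", "Male") and a "prompt_string" field.
ds = load_dataset("tpremoli/CelebA-attrs-80k", split="train")
example = ds[0]
print(example["Smiling"], example["prompt_string"])
```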
## Citation
```bibtex
@inproceedings{liu2015faceattributes,
title = {Deep Learning Face Attributes in the Wild},
author = {Liu, Ziwei and Luo, Ping and Wang, Xiaogang and Tang, Xiaoou},
booktitle = {Proceedings of International Conference on Computer Vision (ICCV)},
month = {December},
year = {2015}
}
```
|
mHossain/final_train_v1_320000 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 11628917.1
num_examples: 27000
- name: test
num_bytes: 1292101.9
num_examples: 3000
download_size: 5657480
dataset_size: 12921019.0
---
# Dataset Card for "final_train_v1_320000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
scholl99/semeval-2016-absa-laptop-processed | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: category
sequence: string
splits:
- name: train
num_bytes: 524413.0
num_examples: 2434
download_size: 227929
dataset_size: 524413.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Dahoas/no_nl_cot_gsm8k | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: prompt
dtype: string
- name: response
dtype: string
- name: nl_answer
dtype: string
splits:
- name: train
num_bytes: 6899597.6447277265
num_examples: 7127
- name: test
num_bytes: 1281050.0181956028
num_examples: 1301
- name: val
num_bytes: 238849.05078125
num_examples: 251
download_size: 4934103
dataset_size: 8419496.713704579
---
# Dataset Card for "no_nl_cot_gsm8k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mvasiliniuc/iva-swift-codeint-clean-train-tokenized | ---
license: other
dataset_info:
features:
- name: ratio
dtype: float64
- name: config_or_test
dtype: bool
- name: has_no_keywords
dtype: bool
- name: has_few_assignments
dtype: bool
- name: input_ids
sequence: int32
- name: ratio_char_token
dtype: float64
splits:
- name: train
num_bytes: 971849564
num_examples: 400000
download_size: 484282225
dataset_size: 971849564
---
|
notoriousdto/marvin-scheme | ---
license: mit
---
|
nithin1995/dfuc_sroie_image_subset | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 505314.0
num_examples: 5
download_size: 470658
dataset_size: 505314.0
---
# Dataset Card for "dfuc_sroie_image_subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_splm__openchat-spin-slimorca-iter3 | ---
pretty_name: Evaluation run of splm/openchat-spin-slimorca-iter3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [splm/openchat-spin-slimorca-iter3](https://huggingface.co/splm/openchat-spin-slimorca-iter3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_splm__openchat-spin-slimorca-iter3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T19:53:19.731716](https://huggingface.co/datasets/open-llm-leaderboard/details_splm__openchat-spin-slimorca-iter3/blob/main/results_2024-02-29T19-53-19.731716.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.645468894886788,\n\
\ \"acc_stderr\": 0.03224081716644438,\n \"acc_norm\": 0.6478366786921959,\n\
\ \"acc_norm_stderr\": 0.032882171469818144,\n \"mc1\": 0.41982864137086906,\n\
\ \"mc1_stderr\": 0.01727703030177577,\n \"mc2\": 0.5899952730357957,\n\
\ \"mc2_stderr\": 0.015718397276305158\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6390784982935154,\n \"acc_stderr\": 0.014034761386175452,\n\
\ \"acc_norm\": 0.6800341296928327,\n \"acc_norm_stderr\": 0.013631345807016191\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6546504680342561,\n\
\ \"acc_stderr\": 0.0047451035439012934,\n \"acc_norm\": 0.8396733718382793,\n\
\ \"acc_norm_stderr\": 0.0036615885079775497\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.02275520495954294,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.02275520495954294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.02293514405391943,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.02293514405391943\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.024162780284017717,\n\
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.024162780284017717\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857413,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857413\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978093,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978093\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010347,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010347\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503228,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503228\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098822,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098822\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993452,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993452\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3575418994413408,\n\
\ \"acc_stderr\": 0.016029394474894886,\n \"acc_norm\": 0.3575418994413408,\n\
\ \"acc_norm_stderr\": 0.016029394474894886\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958143,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958143\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.02977945095730307,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.02977945095730307\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n\
\ \"acc_stderr\": 0.012754553719781752,\n \"acc_norm\": 0.47522816166883963,\n\
\ \"acc_norm_stderr\": 0.012754553719781752\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.027365861131513812,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.027365861131513812\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882537,\n \
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882537\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7714285714285715,\n \"acc_stderr\": 0.026882144922307744,\n\
\ \"acc_norm\": 0.7714285714285715,\n \"acc_norm_stderr\": 0.026882144922307744\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41982864137086906,\n\
\ \"mc1_stderr\": 0.01727703030177577,\n \"mc2\": 0.5899952730357957,\n\
\ \"mc2_stderr\": 0.015718397276305158\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089684\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5974222896133434,\n \
\ \"acc_stderr\": 0.013508523063663418\n }\n}\n```"
repo_url: https://huggingface.co/splm/openchat-spin-slimorca-iter3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|arc:challenge|25_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|gsm8k|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hellaswag|10_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T19-53-19.731716.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T19-53-19.731716.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- '**/details_harness|winogrande|5_2024-02-29T19-53-19.731716.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T19-53-19.731716.parquet'
- config_name: results
data_files:
- split: 2024_02_29T19_53_19.731716
path:
- results_2024-02-29T19-53-19.731716.parquet
- split: latest
path:
- results_2024-02-29T19-53-19.731716.parquet
---
# Dataset Card for Evaluation run of splm/openchat-spin-slimorca-iter3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [splm/openchat-spin-slimorca-iter3](https://huggingface.co/splm/openchat-spin-slimorca-iter3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_splm__openchat-spin-slimorca-iter3",
"harness_winogrande_5",
split="train")
```
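The aggregated results configuration declared in the YAML header above can be loaded the same way; a minimal sketch (the config name `results` and the `latest` split come from that header):
```python
from datasets import load_dataset

# load the aggregated metrics; the "latest" split always points
# to the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_splm__openchat-spin-slimorca-iter3",
    "results",
    split="latest",
)
print(results[0])
```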
## Latest results
These are the [latest results from run 2024-02-29T19:53:19.731716](https://huggingface.co/datasets/open-llm-leaderboard/details_splm__openchat-spin-slimorca-iter3/blob/main/results_2024-02-29T19-53-19.731716.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.645468894886788,
"acc_stderr": 0.03224081716644438,
"acc_norm": 0.6478366786921959,
"acc_norm_stderr": 0.032882171469818144,
"mc1": 0.41982864137086906,
"mc1_stderr": 0.01727703030177577,
"mc2": 0.5899952730357957,
"mc2_stderr": 0.015718397276305158
},
"harness|arc:challenge|25": {
"acc": 0.6390784982935154,
"acc_stderr": 0.014034761386175452,
"acc_norm": 0.6800341296928327,
"acc_norm_stderr": 0.013631345807016191
},
"harness|hellaswag|10": {
"acc": 0.6546504680342561,
"acc_stderr": 0.0047451035439012934,
"acc_norm": 0.8396733718382793,
"acc_norm_stderr": 0.0036615885079775497
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.02275520495954294,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02275520495954294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.02293514405391943,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.02293514405391943
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.024162780284017717,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.024162780284017717
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857413,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857413
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.029597329730978093,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.029597329730978093
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010347,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010347
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503228,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503228
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.03050028317654585,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.03050028317654585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098822,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098822
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993452,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993452
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3575418994413408,
"acc_stderr": 0.016029394474894886,
"acc_norm": 0.3575418994413408,
"acc_norm_stderr": 0.016029394474894886
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958143,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.02977945095730307,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.02977945095730307
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47522816166883963,
"acc_stderr": 0.012754553719781752,
"acc_norm": 0.47522816166883963,
"acc_norm_stderr": 0.012754553719781752
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.027365861131513812,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.027365861131513812
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882537,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882537
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7714285714285715,
"acc_stderr": 0.026882144922307744,
"acc_norm": 0.7714285714285715,
"acc_norm_stderr": 0.026882144922307744
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41982864137086906,
"mc1_stderr": 0.01727703030177577,
"mc2": 0.5899952730357957,
"mc2_stderr": 0.015718397276305158
},
"harness|winogrande|5": {
"acc": 0.7797947908445146,
"acc_stderr": 0.011646276755089684
},
"harness|gsm8k|5": {
"acc": 0.5974222896133434,
"acc_stderr": 0.013508523063663418
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/nodoka_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nodoka/天見ノドカ/和香 (Blue Archive)
This is the dataset of nodoka/天見ノドカ/和香 (Blue Archive), containing 84 images and their tags.
The core tags of this character are `halo, red_eyes, braid, blonde_hair, long_hair, breasts, hair_bun, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 84 | 134.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nodoka_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 84 | 111.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nodoka_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 206 | 231.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nodoka_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
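The packaged `IMG+TXT` variants listed above can be fetched and unpacked directly with `hf_hub_download`; a minimal sketch for the 1200px package (the local directory name `dataset_1200` is just an example):
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 1200px IMG+TXT package listed above
zip_file = hf_hub_download(
    repo_id='CyberHarem/nodoka_bluearchive',
    repo_type='dataset',
    filename='dataset-1200.zip',
)

# extract the images and their .txt tag files
dataset_dir = 'dataset_1200'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```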
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nodoka_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable in the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, long_sleeves, solo, jacket, smile, blush, looking_at_viewer, coat, fur_trim, open_clothes, open_mouth, collared_shirt, grey_headwear, pleated_skirt, holding, simple_background, sweater, black_pantyhose, grey_gloves, white_background, grey_skirt |
| 1 | 14 |  |  |  |  |  | blush, 1girl, looking_at_viewer, obi, solo, smile, striped_clothes, long_sleeves, yukata, single_hair_bun, vertical-striped_kimono, wide_sleeves, closed_mouth, sitting, grey_kimono, open_mouth, collarbone, pink_hair, short_hair, white_kimono |
| 2 | 15 |  |  |  |  |  | onsen, blush, collarbone, water, 1girl, looking_at_viewer, naked_towel, open_mouth, partially_submerged, smile, solo, white_towel, medium_breasts, single_hair_bun, cleavage, multiple_girls, short_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | solo | jacket | smile | blush | looking_at_viewer | coat | fur_trim | open_clothes | open_mouth | collared_shirt | grey_headwear | pleated_skirt | holding | simple_background | sweater | black_pantyhose | grey_gloves | white_background | grey_skirt | obi | striped_clothes | yukata | single_hair_bun | vertical-striped_kimono | wide_sleeves | closed_mouth | sitting | grey_kimono | collarbone | pink_hair | short_hair | white_kimono | onsen | water | naked_towel | partially_submerged | white_towel | medium_breasts | cleavage | multiple_girls |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:---------|:--------|:--------|:--------------------|:-------|:-----------|:---------------|:-------------|:-----------------|:----------------|:----------------|:----------|:--------------------|:----------|:------------------|:--------------|:-------------------|:-------------|:------|:------------------|:---------|:------------------|:--------------------------|:---------------|:---------------|:----------|:--------------|:-------------|:------------|:-------------|:---------------|:--------|:--------|:--------------|:----------------------|:--------------|:-----------------|:-----------|:-----------------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | X | | X | X | X | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 2 | 15 |  |  |  |  |  | X | | X | | X | X | X | | | | X | | | | | | | | | | | | | | X | | | | | | X | | X | | X | X | X | X | X | X | X | X |
|
alancooney/original_relations | ---
license: mit
---
|
Makauli/gtaiv | ---
license: unknown
---
|
praneeth232/diamond-price-predictor-logs | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
amynechiban/chibano | ---
license: openrail
---
|
Aanchan/en_corpora_parliament_processed | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 281100121
num_examples: 1892723
download_size: 155904367
dataset_size: 281100121
---
# Dataset Card for "en_corpora_parliament_processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Qwen__Qwen1.5-4B-Chat | ---
pretty_name: Evaluation run of Qwen/Qwen1.5-4B-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Qwen/Qwen1.5-4B-Chat](https://huggingface.co/Qwen/Qwen1.5-4B-Chat) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Qwen__Qwen1.5-4B-Chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-12T18:02:11.797174](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen1.5-4B-Chat/blob/main/results_2024-02-12T18-02-11.797174.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5452693175035633,\n\
\ \"acc_stderr\": 0.033961232333256236,\n \"acc_norm\": 0.555864434515964,\n\
\ \"acc_norm_stderr\": 0.03480885079816102,\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.4479029862950014,\n\
\ \"mc2_stderr\": 0.015185042808380176\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4044368600682594,\n \"acc_stderr\": 0.014342036483436175,\n\
\ \"acc_norm\": 0.4325938566552901,\n \"acc_norm_stderr\": 0.014478005694182528\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5170284803823939,\n\
\ \"acc_stderr\": 0.004986886806565644,\n \"acc_norm\": 0.6972714598685521,\n\
\ \"acc_norm_stderr\": 0.004584997935360418\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\
\ \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.5138888888888888,\n\
\ \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.041443118108781526,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.041443118108781526\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.0255064816981382,\n \"acc_norm\"\
: 0.4312169312169312,\n \"acc_norm_stderr\": 0.0255064816981382\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6612903225806451,\n \"acc_stderr\": 0.026923446059302837,\n \"\
acc_norm\": 0.6612903225806451,\n \"acc_norm_stderr\": 0.026923446059302837\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486519,\n \"\
acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486519\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.036810508691615486,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.036810508691615486\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147602,\n\
\ \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147602\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5256410256410257,\n \"acc_stderr\": 0.025317649726448663,\n\
\ \"acc_norm\": 0.5256410256410257,\n \"acc_norm_stderr\": 0.025317649726448663\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465076,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465076\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.0324371805513741,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.0324371805513741\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.708256880733945,\n \"acc_stderr\": 0.01948930096887653,\n \"acc_norm\"\
: 0.708256880733945,\n \"acc_norm_stderr\": 0.01948930096887653\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n\
\ \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.46296296296296297,\n\
\ \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.03242661719827218,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.03242661719827218\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7130801687763713,\n \"acc_stderr\": 0.02944377302259469,\n \
\ \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.02944377302259469\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899616,\n\
\ \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7100893997445722,\n\
\ \"acc_stderr\": 0.01622501794477096,\n \"acc_norm\": 0.7100893997445722,\n\
\ \"acc_norm_stderr\": 0.01622501794477096\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165555,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26033519553072626,\n\
\ \"acc_stderr\": 0.014676252009319478,\n \"acc_norm\": 0.26033519553072626,\n\
\ \"acc_norm_stderr\": 0.014676252009319478\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.02799672318063145,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.02799672318063145\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n\
\ \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.5819935691318328,\n\
\ \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5617283950617284,\n \"acc_stderr\": 0.027607914087400473,\n\
\ \"acc_norm\": 0.5617283950617284,\n \"acc_norm_stderr\": 0.027607914087400473\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778852,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778852\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40808344198174706,\n\
\ \"acc_stderr\": 0.012552598958563662,\n \"acc_norm\": 0.40808344198174706,\n\
\ \"acc_norm_stderr\": 0.012552598958563662\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.48161764705882354,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.48161764705882354,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5375816993464052,\n \"acc_stderr\": 0.02017061497496976,\n \
\ \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.02017061497496976\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.029504896454595957,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.029504896454595957\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n\
\ \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.7114427860696517,\n\
\ \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.672514619883041,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.672514619883041,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.4479029862950014,\n\
\ \"mc2_stderr\": 0.015185042808380176\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6495659037095501,\n \"acc_stderr\": 0.013409047676670185\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.024260803639120546,\n \
\ \"acc_stderr\": 0.0042380079000014035\n }\n}\n```"
repo_url: https://huggingface.co/Qwen/Qwen1.5-4B-Chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|arc:challenge|25_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|gsm8k|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hellaswag|10_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T18-02-11.797174.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T18-02-11.797174.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- '**/details_harness|winogrande|5_2024-02-12T18-02-11.797174.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-12T18-02-11.797174.parquet'
- config_name: results
data_files:
- split: 2024_02_12T18_02_11.797174
path:
- results_2024-02-12T18-02-11.797174.parquet
- split: latest
path:
- results_2024-02-12T18-02-11.797174.parquet
---
# Dataset Card for Evaluation run of Qwen/Qwen1.5-4B-Chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Qwen/Qwen1.5-4B-Chat](https://huggingface.co/Qwen/Qwen1.5-4B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Qwen__Qwen1.5-4B-Chat",
"harness_winogrande_5",
split="train")
```
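The aggregated metrics can be loaded in the same way through the "results" configuration defined above; a minimal sketch following the same pattern (the config and split names come from the configuration list in this card):
```python
from datasets import load_dataset

# "results" is the aggregated-results configuration listed in this card;
# the "latest" split points to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_Qwen__Qwen1.5-4B-Chat",
                       "results",
                       split="latest")
```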
## Latest results
These are the [latest results from run 2024-02-12T18:02:11.797174](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen1.5-4B-Chat/blob/main/results_2024-02-12T18-02-11.797174.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5452693175035633,
"acc_stderr": 0.033961232333256236,
"acc_norm": 0.555864434515964,
"acc_norm_stderr": 0.03480885079816102,
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.4479029862950014,
"mc2_stderr": 0.015185042808380176
},
"harness|arc:challenge|25": {
"acc": 0.4044368600682594,
"acc_stderr": 0.014342036483436175,
"acc_norm": 0.4325938566552901,
"acc_norm_stderr": 0.014478005694182528
},
"harness|hellaswag|10": {
"acc": 0.5170284803823939,
"acc_stderr": 0.004986886806565644,
"acc_norm": 0.6972714598685521,
"acc_norm_stderr": 0.004584997935360418
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.04179596617581,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.04179596617581
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.041443118108781526,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.041443118108781526
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.0255064816981382,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.0255064816981382
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.026923446059302837,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.026923446059302837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486519,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486519
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.036810508691615486,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.036810508691615486
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932022,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932022
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.03051611137147602,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.03051611137147602
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5256410256410257,
"acc_stderr": 0.025317649726448663,
"acc_norm": 0.5256410256410257,
"acc_norm_stderr": 0.025317649726448663
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465076,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465076
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.0324371805513741,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.0324371805513741
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.708256880733945,
"acc_stderr": 0.01948930096887653,
"acc_norm": 0.708256880733945,
"acc_norm_stderr": 0.01948930096887653
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.03242661719827218,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.03242661719827218
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.043285772152629715,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.043285772152629715
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6257668711656442,
"acc_stderr": 0.03802068102899616,
"acc_norm": 0.6257668711656442,
"acc_norm_stderr": 0.03802068102899616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280042,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280042
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7100893997445722,
"acc_stderr": 0.01622501794477096,
"acc_norm": 0.7100893997445722,
"acc_norm_stderr": 0.01622501794477096
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.025416003773165555,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.025416003773165555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26033519553072626,
"acc_stderr": 0.014676252009319478,
"acc_norm": 0.26033519553072626,
"acc_norm_stderr": 0.014676252009319478
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.02799672318063145,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.02799672318063145
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5819935691318328,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.5819935691318328,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5617283950617284,
"acc_stderr": 0.027607914087400473,
"acc_norm": 0.5617283950617284,
"acc_norm_stderr": 0.027607914087400473
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778852,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778852
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40808344198174706,
"acc_stderr": 0.012552598958563662,
"acc_norm": 0.40808344198174706,
"acc_norm_stderr": 0.012552598958563662
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.48161764705882354,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.48161764705882354,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.02017061497496976,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.02017061497496976
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.029504896454595957,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.029504896454595957
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.672514619883041,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.672514619883041,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.4479029862950014,
"mc2_stderr": 0.015185042808380176
},
"harness|winogrande|5": {
"acc": 0.6495659037095501,
"acc_stderr": 0.013409047676670185
},
"harness|gsm8k|5": {
"acc": 0.024260803639120546,
"acc_stderr": 0.0042380079000014035
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
gabrielmbmb/wikipedia_es_genstruct_v2_iter_1 | ---
dataset_info:
features:
- name: title
dtype: string
- name: content
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
sequence: string
- name: raw_generation_responses
sequence: string
- name: conversation
sequence:
sequence: string
- name: messages
list:
- name: message
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 1823130
num_examples: 500
download_size: 976330
dataset_size: 1823130
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Atipico1/trivia_test_adversary | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: gpt_answer_sentence
dtype: string
- name: gpt_adv_sentence
dtype: string
- name: is_valid_sentence
dtype: bool
- name: gpt_adv_passage
dtype: string
- name: is_valid_passage
dtype: bool
splits:
- name: train
num_bytes: 84868766
num_examples: 11313
download_size: 49647241
dataset_size: 84868766
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Adminhuggingface/LORA_DATASET | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 6575532.0
num_examples: 26
download_size: 6574426
dataset_size: 6575532.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "LORA_DATASET"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carnival13/rbrt_test | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 1270137685
num_examples: 900000
download_size: 282453475
dataset_size: 1270137685
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rbrt_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eming/stock_original_flatten_data | ---
dataset_info:
features:
- name: trade_date
sequence: string
- name: open
sequence: float64
- name: high
sequence: float64
- name: low
sequence: float64
- name: close
sequence: float64
- name: pre_close
sequence: float64
- name: change
sequence: float64
- name: pct_chg
sequence: float64
- name: vol
sequence: float64
- name: amount
sequence: float64
- name: ts_code
dtype: string
splits:
- name: train
num_bytes: 1073280902
num_examples: 5356
download_size: 606707380
dataset_size: 1073280902
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-129000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1073129
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
llm-aes/gemini_meva_full_rate_explain | ---
dataset_info:
features:
- name: task_id
dtype: string
- name: worker_id
dtype: string
- name: human_label
dtype: int64
- name: llm_label
dtype: int64
- name: generator_1
dtype: string
- name: generator_2
dtype: string
- name: premise
dtype: string
splits:
- name: train
num_bytes: 375251
num_examples: 2000
download_size: 49222
dataset_size: 375251
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
agicorp/MathInstruct | ---
license: mit
task_categories:
- text-generation
language:
- en
pretty_name: MathInstruct
size_categories:
- 100K<n<1M
tags:
- math
---
# 🦣 MAmmoTH: Building Math Generalist Models through Hybrid Instruction Tuning
MathInstruct is a meticulously curated instruction tuning dataset that is lightweight yet generalizable. MathInstruct is compiled from 13 math rationale datasets, six of which are newly curated by this work. It uniquely focuses on the hybrid use of chain-of-thought (CoT) and program-of-thought (PoT) rationales, and ensures extensive coverage of diverse mathematical fields.
Project Page: [https://tiger-ai-lab.github.io/MAmmoTH/](https://tiger-ai-lab.github.io/MAmmoTH/)
Paper: [https://arxiv.org/pdf/2309.05653.pdf](https://arxiv.org/pdf/2309.05653.pdf)
Code: [https://github.com/TIGER-AI-Lab/MAmmoTH](https://github.com/TIGER-AI-Lab/MAmmoTH)
Models:
| | **Base Model: Llama-2** | **Base Model: Code Llama** |
|-----|---------------------------------------------------------------|--------------------------------------------------------------------------|
| 7B | 🦣 [MAmmoTH-7B](https://huggingface.co/TIGER-Lab/MAmmoTH-7B) | 🦣 [MAmmoTH-Coder-7B](https://huggingface.co/TIGER-Lab/MAmmoTH-Coder-7B) |
| 13B | 🦣 [MAmmoTH-13B](https://huggingface.co/TIGER-Lab/MAmmoTH-13B) | 🦣 [MAmmoTH-Coder-13B](https://huggingface.co/TIGER-Lab/MAmmoTH-Coder-13B)|
| 34B | - | 🦣 [MAmmoTH-Coder-34B](https://huggingface.co/TIGER-Lab/MAmmoTH-Coder-34B)|
| 70B | 🦣 [MAmmoTH-70B](https://huggingface.co/TIGER-Lab/MAmmoTH-70B) | - |
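As a quick way to inspect the data, a minimal loading sketch is shown below; the field names ("source", "instruction", "output") are assumptions about the schema and are not confirmed by this card:
```python
from datasets import load_dataset

# Load the instruction-tuning data (the "train" split name is an assumption).
ds = load_dataset("agicorp/MathInstruct", split="train")

# Inspect one example; "source", "instruction" and "output" are assumed field
# names following the original MathInstruct schema, not confirmed by this card.
example = ds[0]
print(example.keys())
print(example.get("instruction", ""))
print(example.get("output", ""))
```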
## **License**
Please check out the license of each subset in our curated dataset MathInstruct.
| Dataset Name | License Type |
|--------------|----------------|
| GSM8K | MIT |
| GSM8K-RFT | Not listed |
| AQuA-RAT | Apache 2.0 |
| MATH | MIT |
| TheoremQA | MIT |
| Camel-Math | Attribution-NonCommercial 4.0 International |
| NumGLUE | Apache-2.0 |
| MathQA | Apache-2.0 |
| Our Curated | MIT |
## **Citation**
Please cite our paper if you use our data, model or code. Please also kindly cite the original dataset papers.
```
@article{yue2023mammoth,
title={MAmmoTH: Building Math Generalist Models through Hybrid Instruction Tuning},
author={Xiang Yue, Xingwei Qu, Ge Zhang, Yao Fu, Wenhao Huang, Huan Sun, Yu Su, Wenhu Chen},
journal={arXiv preprint arXiv:2309.05653},
year={2023}
}
``` |
CPJKU/openmic | ---
dataset_info:
features:
- name: filename
dtype: string
- name: 'true'
sequence: float32
length: 20
- name: mask
sequence: int32
length: 20
- name: mp3_bytes
dtype: binary
splits:
- name: train
num_bytes: 1790991884
num_examples: 14915
- name: test
num_bytes: 611455142
num_examples: 5085
download_size: 0
dataset_size: 2402447026
configs:
- config_name: default
data_files:
- split: train
path: data/shard_train_*
- split: test
path: data/shard_test_*
---
# CPJKU/openmic
The dataset is made available by Spotify AB under a Creative Commons Attribution 4.0 International (CC BY 4.0) license. The full terms of this license are included alongside this dataset.
This dataset is preprocessed and compressed to 32 kHz MP3 files. The bytes of the MP3 files are embedded.
The MP3 bytes can be decoded quickly, for [example](https://github.com/kkoutini/PaSST/blob/4519e4605989b8c2e62dccb5b928af9bf7bf8602/audioset/dataset.py#L55) with the PaSST dataset loader, or with [minimp3](https://github.com/f0k/minimp3py).
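A minimal decoding sketch, assuming the column names from the dataset info above ("filename", "mp3_bytes") and using torchaudio as one possible MP3 decoder (an illustrative choice, not the loader referenced above):
```python
import io

import torchaudio
from datasets import load_dataset

# Stream one example from the train split; the column names ("filename",
# "true", "mask", "mp3_bytes") come from the dataset_info above.
ds = load_dataset("CPJKU/openmic", split="train", streaming=True)
example = next(iter(ds))

# Decode the embedded MP3 bytes with torchaudio (illustrative choice; any
# MP3-capable decoder such as minimp3py should work as well).
waveform, sample_rate = torchaudio.load(io.BytesIO(example["mp3_bytes"]), format="mp3")
print(example["filename"], waveform.shape, sample_rate)
```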
Take a look at the original dataset for more information.
The original dataset contains the following:
- 10 second snippets of audio, in a directory format like 'audio/{0:3}/{0}.ogg'.format(sample_key)
- VGGish features as JSON objects, in a directory format like 'vggish/{0:3}/{0}.json'.format(sample_key)
- MD5 checksums for each OGG and JSON file
- Anonymized individual responses, in 'openmic-2018-individual-responses.csv'
- Aggregated labels, in 'openmic-2018-aggregated-labels.csv'
- Track metadata, with licenses for each audio recording, in 'openmic-2018-metadata.csv'
- A Python-friendly NPZ file of features and labels, 'openmic-2018.npz'
- Sample partitions for train and test, in 'partitions/*.txt'
## Homepage
https://zenodo.org/records/1432913
## Citation
```
Humphrey, Eric J., Durand, Simon, and McFee, Brian. "OpenMIC-2018: An Open Dataset for Multiple Instrument Recognition." in Proceedings of the 19th International Society for Music Information Retrieval Conference (ISMIR), 2018.
```
## License
CC BY 4.0
|
villee/analogues1a-i | ---
license: apache-2.0
---
|
joey234/mmlu-management-original-neg | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 4851.941747572816
num_examples: 25
download_size: 5783
dataset_size: 4851.941747572816
---
# Dataset Card for "mmlu-management-original-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
barbaroo/STS | ---
language:
- fo
size_categories:
- 1K<n<10K
---
This is a synthetic Faroese Semantic Textual Similarity dataset. Labels range from 0 (no similarity) to 5 (the two sentences are completely equivalent).
The dataset was generated by:
- Translating sentences from the Basic Faroese Language Resource Kit (BLARK) corpus to English by leveraging a Nordic LLM, GPT-Sw3.
- Sentences were compared to each other in terms of semantic similarity with Sentence-BERT (SBERT); a rough scoring sketch is shown after this list.
- Pairs of sentences were then sampled uniformly in terms of similarity score, to compile a balanced dataset.
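The pairwise scoring step can be reproduced roughly as sketched below. This is a minimal sketch: the exact SBERT checkpoint used to build the dataset is not stated here, so the multilingual model name and the rescaling to the 0-5 range are illustrative assumptions.
```python
from itertools import combinations

from sentence_transformers import SentenceTransformer, util

# Hypothetical checkpoint: the card does not name the SBERT model actually used.
model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

# Placeholder Faroese sentences for illustration.
sentences = [
    "Veðrið er gott í dag.",
    "Tað regnar nógv í dag.",
    "Hetta er ein bók.",
]

embeddings = model.encode(sentences, convert_to_tensor=True)

# Cosine similarity for every pair, rescaled from [0, 1] to the 0-5 label range.
for i, j in combinations(range(len(sentences)), 2):
    cos = util.cos_sim(embeddings[i], embeddings[j]).item()
    print(sentences[i], "|", sentences[j], "->", round(5 * max(cos, 0.0), 2))
```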
The dataset contains 200 sentence pairs for each class (similarity = 0, 1, 2, 3, 4, 5). |
jamsonE/dtv1 | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_wnli_reduced_relative | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 135
num_examples: 1
- name: test
num_bytes: 2286
num_examples: 9
- name: train
num_bytes: 1453
num_examples: 7
download_size: 10449
dataset_size: 3874
task_categories:
- text-classification
language:
- en
---
# Dataset Card for "MULTI_VALUE_wnli_reduced_relative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arcd | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- ar
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: arcd
pretty_name: ARCD
language_bcp47:
- ar-SA
dataset_info:
config_name: plain_text
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 811036
num_examples: 693
- name: validation
num_bytes: 885620
num_examples: 702
download_size: 365858
dataset_size: 1696656
configs:
- config_name: plain_text
data_files:
- split: train
path: plain_text/train-*
- split: validation
path: plain_text/validation-*
default: true
---
# Dataset Card for "arcd"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/husseinmozannar/SOQAL/tree/master/data](https://github.com/husseinmozannar/SOQAL/tree/master/data)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 1.94 MB
- **Size of the generated dataset:** 1.70 MB
- **Total amount of disk used:** 3.64 MB
### Dataset Summary
The Arabic Reading Comprehension Dataset (ARCD) is composed of 1,395 questions posed by crowdworkers on Wikipedia articles.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### plain_text
- **Size of downloaded dataset files:** 1.94 MB
- **Size of the generated dataset:** 1.70 MB
- **Total amount of disk used:** 3.64 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"answers": "{\"answer_start\": [34], \"text\": [\"صحابي من صحابة رسول الإسلام محمد، وعمُّه وأخوه من الرضاعة وأحد وزرائه الأربعة عشر،\"]}...",
"context": "\"حمزة بن عبد المطلب الهاشمي القرشي صحابي من صحابة رسول الإسلام محمد، وعمُّه وأخوه من الرضاعة وأحد وزرائه الأربعة عشر، وهو خير أع...",
"id": "621723207492",
"question": "من هو حمزة بن عبد المطلب؟",
"title": "حمزة بن عبد المطلب"
}
```
### Data Fields
The data fields are the same among all splits.
#### plain_text
- `id`: a `string` feature.
- `title`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int32` feature.
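A quick way to inspect these fields is to load the `plain_text` configuration with the `datasets` library (a minimal sketch):
```python
from datasets import load_dataset

# Load the Arabic Reading Comprehension Dataset (plain_text configuration).
arcd = load_dataset("arcd", "plain_text")

sample = arcd["train"][0]
print(sample["id"], sample["title"])
print(sample["question"])
print(sample["context"][:100])
# 'answers' is a dictionary of parallel lists: answer texts and their character offsets.
print(sample["answers"]["text"], sample["answers"]["answer_start"])
```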
### Data Splits
| name | train | validation |
| ---------- | ----: | ---------: |
| plain_text | 693 | 702 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{mozannar-etal-2019-neural,
title = "Neural {A}rabic Question Answering",
author = "Mozannar, Hussein and
Maamary, Elie and
El Hajal, Karl and
Hajj, Hazem",
booktitle = "Proceedings of the Fourth Arabic Natural Language Processing Workshop",
month = aug,
year = "2019",
address = "Florence, Italy",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/W19-4612",
doi = "10.18653/v1/W19-4612",
pages = "108--118",
abstract = "This paper tackles the problem of open domain factual Arabic question answering (QA) using Wikipedia as our knowledge source. This constrains the answer of any question to be a span of text in Wikipedia. Open domain QA for Arabic entails three challenges: annotated QA datasets in Arabic, large scale efficient information retrieval and machine reading comprehension. To deal with the lack of Arabic QA datasets we present the Arabic Reading Comprehension Dataset (ARCD) composed of 1,395 questions posed by crowdworkers on Wikipedia articles, and a machine translation of the Stanford Question Answering Dataset (Arabic-SQuAD). Our system for open domain question answering in Arabic (SOQAL) is based on two components: (1) a document retriever using a hierarchical TF-IDF approach and (2) a neural reading comprehension model using the pre-trained bi-directional transformer BERT. Our experiments on ARCD indicate the effectiveness of our approach with our BERT-based reader achieving a 61.3 F1 score, and our open domain system SOQAL achieving a 27.6 F1 score.",
}
```
### Contributions
Thanks to [@albertvillanova](https://github.com/albertvillanova), [@lewtun](https://github.com/lewtun), [@mariamabarham](https://github.com/mariamabarham), [@thomwolf](https://github.com/thomwolf), [@tayciryahmed](https://github.com/tayciryahmed) for adding this dataset. |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_177 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1161648144.0
num_examples: 228132
download_size: 1185470831
dataset_size: 1161648144.0
---
# Dataset Card for "chunk_177"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shossain/qa-no-pad-16384 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 691233451
num_examples: 14119
download_size: 181098407
dataset_size: 691233451
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "qa-no-pad-16384"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MBZUAI/GranD-f | ---
license: apache-2.0
---
[](https://grounding-anything.com/GranD-f)
# 🌐💬 GranD-f - Grounded Conversation Generation (GCG) Dataset
GranD-f comprises four datasets: one high-quality human-annotated set proposed in our GLaMM paper, and three other open-source datasets (Open-PSG, RefCOCO-g and Flickr-30k) repurposed for the GCG task using OpenAI GPT-4.
## 💻 Download
```
git lfs install
git clone https://huggingface.co/datasets/MBZUAI/GranD-f
```
## 📚 Additional Resources
- **Paper:** [ArXiv](https://arxiv.org/abs/2311.03356).
- **GitHub Repository:** [GitHub - GLaMM](https://github.com/mbzuai-oryx/groundingLMM).
- **Project Page:** For a detailed overview and insights into the project, visit our [Project Page - GLaMM](https://mbzuai-oryx.github.io/groundingLMM/).
## 📜 Citations and Acknowledgments
```bibtex
@article{hanoona2023GLaMM,
title={GLaMM: Pixel Grounding Large Multimodal Model},
author={Rasheed, Hanoona and Maaz, Muhammad and Shaji, Sahal and Shaker, Abdelrahman and Khan, Salman and Cholakkal, Hisham and Anwer, Rao M. and Xing, Eric and Yang, Ming-Hsuan and Khan, Fahad S.},
journal={The IEEE/CVF Conference on Computer Vision and Pattern Recognition},
year={2024}
}
``` |
ryrobotics/mrlove | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': train
- name: text
dtype: string
splits:
- name: train
num_bytes: 7613973.0
num_examples: 23
download_size: 7615284
dataset_size: 7613973.0
---
# Dataset Card for "mrlove"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
krvhrv/Healix-V2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 945466472
num_examples: 1171239
download_size: 542531731
dataset_size: 945466472
---
# Dataset Card for "Healix-V2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mach-12/breast-cancer-microwave | ---
license: apache-2.0
---
|
jorge-henao/historias_conflicto_colombia | ---
license: apache-2.0
---
|
giux78/90000-100000-ultrafeedback-ita | ---
configs:
- config_name: default
data_files:
- split: test_gen
path: data/test_gen-*
- split: test_sft
path: data/test_sft-*
- split: train_gen
path: data/train_gen-*
- split: train_sft
path: data/train_sft-*
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: test_gen
num_bytes: 148276089
num_examples: 28304
- name: test_sft
num_bytes: 154695659
num_examples: 23110
- name: train_gen
num_bytes: 1347396812
num_examples: 256032
- name: train_sft
num_bytes: 73616996
num_examples: 10000
download_size: 930852553
dataset_size: 1723985556
---
# Dataset Card for "90000-100000-ultrafeedback-ita"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_AurelPx__Pegasus-7b-slerp | ---
pretty_name: Evaluation run of AurelPx/Pegasus-7b-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AurelPx/Pegasus-7b-slerp](https://huggingface.co/AurelPx/Pegasus-7b-slerp) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AurelPx__Pegasus-7b-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T18:33:06.988554](https://huggingface.co/datasets/open-llm-leaderboard/details_AurelPx__Pegasus-7b-slerp/blob/main/results_2024-03-22T18-33-06.988554.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6512551328881332,\n\
\ \"acc_stderr\": 0.03208640965508132,\n \"acc_norm\": 0.6502639303495366,\n\
\ \"acc_norm_stderr\": 0.032762873436953825,\n \"mc1\": 0.6217870257037944,\n\
\ \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.7712800289093449,\n\
\ \"mc2_stderr\": 0.013859800502752762\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7098976109215017,\n \"acc_stderr\": 0.013261573677520766,\n\
\ \"acc_norm\": 0.726962457337884,\n \"acc_norm_stderr\": 0.013019332762635753\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7171878111929895,\n\
\ \"acc_stderr\": 0.004494454911844621,\n \"acc_norm\": 0.8904600677155945,\n\
\ \"acc_norm_stderr\": 0.003116771577319422\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993466,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993466\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n\
\ \"acc_stderr\": 0.016593394227564843,\n \"acc_norm\": 0.43798882681564244,\n\
\ \"acc_norm_stderr\": 0.016593394227564843\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n\
\ \"acc_stderr\": 0.01275197796767601,\n \"acc_norm\": 0.47327249022164275,\n\
\ \"acc_norm_stderr\": 0.01275197796767601\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507205,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507205\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6217870257037944,\n\
\ \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.7712800289093449,\n\
\ \"mc2_stderr\": 0.013859800502752762\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8500394632991318,\n \"acc_stderr\": 0.010034394804580809\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7012888551933283,\n \
\ \"acc_stderr\": 0.012607137125693633\n }\n}\n```"
repo_url: https://huggingface.co/AurelPx/Pegasus-7b-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|arc:challenge|25_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|gsm8k|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hellaswag|10_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T18-33-06.988554.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T18-33-06.988554.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- '**/details_harness|winogrande|5_2024-03-22T18-33-06.988554.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T18-33-06.988554.parquet'
- config_name: results
data_files:
- split: 2024_03_22T18_33_06.988554
path:
- results_2024-03-22T18-33-06.988554.parquet
- split: latest
path:
- results_2024-03-22T18-33-06.988554.parquet
---
# Dataset Card for Evaluation run of AurelPx/Pegasus-7b-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AurelPx/Pegasus-7b-slerp](https://huggingface.co/AurelPx/Pegasus-7b-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AurelPx__Pegasus-7b-slerp",
"harness_winogrande_5",
split="train")
```
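The same pattern works for the aggregated numbers; a minimal sketch (assuming the `results` config and the `latest` split declared in the YAML above are loadable as-is):
```python
from datasets import load_dataset

# Aggregated metrics live in the "results" config; the "latest" split always
# points at the most recent timestamped run (see the config list above).
results = load_dataset(
    "open-llm-leaderboard/details_AurelPx__Pegasus-7b-slerp",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics of the run
```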
## Latest results
These are the [latest results from run 2024-03-22T18:33:06.988554](https://huggingface.co/datasets/open-llm-leaderboard/details_AurelPx__Pegasus-7b-slerp/blob/main/results_2024-03-22T18-33-06.988554.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6512551328881332,
"acc_stderr": 0.03208640965508132,
"acc_norm": 0.6502639303495366,
"acc_norm_stderr": 0.032762873436953825,
"mc1": 0.6217870257037944,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.7712800289093449,
"mc2_stderr": 0.013859800502752762
},
"harness|arc:challenge|25": {
"acc": 0.7098976109215017,
"acc_stderr": 0.013261573677520766,
"acc_norm": 0.726962457337884,
"acc_norm_stderr": 0.013019332762635753
},
"harness|hellaswag|10": {
"acc": 0.7171878111929895,
"acc_stderr": 0.004494454911844621,
"acc_norm": 0.8904600677155945,
"acc_norm_stderr": 0.003116771577319422
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993466,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993466
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.016593394227564843,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.016593394227564843
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.01275197796767601,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.01275197796767601
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507205,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507205
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6217870257037944,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.7712800289093449,
"mc2_stderr": 0.013859800502752762
},
"harness|winogrande|5": {
"acc": 0.8500394632991318,
"acc_stderr": 0.010034394804580809
},
"harness|gsm8k|5": {
"acc": 0.7012888551933283,
"acc_stderr": 0.012607137125693633
}
}
```
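If you would rather read that JSON file directly instead of going through `load_dataset`, a minimal sketch (assuming a standard `huggingface_hub` install and the file name from the link above):
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file referenced above straight from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_AurelPx__Pegasus-7b-slerp",
    filename="results_2024-03-22T18-33-06.988554.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)
# The top-level layout of the raw file may differ from the excerpt shown above,
# so inspect the keys before drilling down to a specific task.
print(list(raw.keys()))
```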
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
joey234/sst2_non_affix | ---
dataset_info:
features:
- name: idx
dtype: int32
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': positive
splits:
- name: validation
num_bytes: 98088.14220183487
num_examples: 805
download_size: 66484
dataset_size: 98088.14220183487
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "sst2_non_affix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_132 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1104760320
num_examples: 216960
download_size: 1128059021
dataset_size: 1104760320
---
# Dataset Card for "chunk_132"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_81_1713225515 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 136632
num_examples: 350
download_size: 72625
dataset_size: 136632
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hezarai/parstwiner | ---
task_categories:
- token-classification
language:
- fa
pretty_name: ParsTwiNER
---
ParsTwiNER is a named-entity recognition dataset for Persian Twitter created by Aghajani et al. [Paper](https://paperswithcode.com/paper/parstwiner-a-corpus-for-named-entity)
> As a result of unstructured sentences and some misspellings and errors, finding named entities in a noisy environment such as social media takes much more effort.
> ParsTwiNER contains about 250k tokens, based on standard instructions like MUC-6 or CoNLL 2003, gathered from Persian Twitter. Using Cohen’s Kappa coefficient, the consistency of annotators is 0.95, a high score.
> In this study, we demonstrate that some state-of-the-art models degrade on these corpora, and train a new model using parallel transfer learning based on the BERT architecture.
> Experimental results show that the model works well in informal Persian as well as in formal Persian. |
vwxyzjn/summarize_from_feedback_oai_preprocessing_1704166566 | ---
dataset_info:
features:
- name: info
struct:
- name: id
dtype: string
- name: post
dtype: string
- name: title
dtype: string
- name: subreddit
dtype: string
- name: site
dtype: string
- name: article
dtype: string
- name: summaries
list:
- name: text
dtype: string
- name: policy
dtype: string
- name: note
dtype: string
- name: choice
dtype: int32
- name: worker
dtype: string
- name: batch
dtype: string
- name: split
dtype: string
- name: extra
struct:
- name: confidence
dtype: int32
- name: query_token
sequence: int64
- name: query
dtype: string
- name: response0
dtype: string
- name: response0_token
sequence: int64
- name: response0_token_len
dtype: int64
- name: response1
dtype: string
- name: response1_token
sequence: int64
- name: response1_token_len
dtype: int64
- name: response0_policy
dtype: string
- name: response1_policy
dtype: string
- name: policies
dtype: string
- name: query_response0
dtype: string
- name: query_response0_token
sequence: int64
- name: query_response0_token_len
dtype: int64
- name: query_response1
dtype: string
- name: query_response1_token
sequence: int64
- name: query_response1_token_len
dtype: int64
splits:
- name: train
num_bytes: 2145150935
num_examples: 92858
- name: validation
num_bytes: 2005104645
num_examples: 86086
download_size: 285738191
dataset_size: 4150255580
---
# Dataset Card for "summarize_from_feedback_oai_preprocessing_1704166566"
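The schema above describes pairwise summary comparisons (a query plus two candidate responses and an annotator choice). A minimal sketch of reading one comparison, assuming, as a hypothetical reading of the undocumented fields, that `choice` indexes the preferred response:
```python
from datasets import load_dataset

# Field names are taken from the YAML schema above; treating "choice" as an index
# into (response0, response1) is an assumption, not documented in this card.
ds = load_dataset(
    "vwxyzjn/summarize_from_feedback_oai_preprocessing_1704166566",
    split="validation",
)
row = ds[0]
preferred = row["response0"] if row["choice"] == 0 else row["response1"]
print(row["query"][:200])
print("preferred summary:", preferred)
```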
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_codellama__CodeLlama-13b-hf | ---
pretty_name: Evaluation run of codellama/CodeLlama-13b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codellama/CodeLlama-13b-hf](https://huggingface.co/codellama/CodeLlama-13b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codellama__CodeLlama-13b-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T09:02:19.641763](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-13b-hf/blob/main/results_2023-10-17T09-02-19.641763.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.00033145814652192065,\n \"f1\": 0.05248531879194655,\n\
\ \"f1_stderr\": 0.0012515405190332619,\n \"acc\": 0.3964846847094825,\n\
\ \"acc_stderr\": 0.011095593973496732\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652192065,\n\
\ \"f1\": 0.05248531879194655,\n \"f1_stderr\": 0.0012515405190332619\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12130401819560273,\n \
\ \"acc_stderr\": 0.008992888497275572\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6716653512233622,\n \"acc_stderr\": 0.01319829944971789\n\
\ }\n}\n```"
repo_url: https://huggingface.co/codellama/CodeLlama-13b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|arc:challenge|25_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T09_02_19.641763
path:
- '**/details_harness|drop|3_2023-10-17T09-02-19.641763.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T09-02-19.641763.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T09_02_19.641763
path:
- '**/details_harness|gsm8k|5_2023-10-17T09-02-19.641763.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T09-02-19.641763.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hellaswag|10_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T09_02_19.641763
path:
- '**/details_harness|winogrande|5_2023-10-17T09-02-19.641763.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T09-02-19.641763.parquet'
- config_name: results
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- results_2023-08-25T22:41:00.019716.parquet
- split: 2023_10_17T09_02_19.641763
path:
- results_2023-10-17T09-02-19.641763.parquet
- split: latest
path:
- results_2023-10-17T09-02-19.641763.parquet
---
# Dataset Card for Evaluation run of codellama/CodeLlama-13b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/codellama/CodeLlama-13b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [codellama/CodeLlama-13b-hf](https://huggingface.co/codellama/CodeLlama-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codellama__CodeLlama-13b-hf",
"harness_winogrande_5",
split="train")
```
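Because this repository contains two runs, each config also exposes one split per timestamp, which lets you compare runs side by side; a minimal sketch (assuming the split names declared in the YAML above load exactly as written):
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_codellama__CodeLlama-13b-hf"
# One split per evaluation run, plus a "latest" alias (names from the YAML above).
first_run = load_dataset(repo, "results", split="2023_08_25T22_41_00.019716")
second_run = load_dataset(repo, "results", split="2023_10_17T09_02_19.641763")
print(first_run[0])
print(second_run[0])
```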
## Latest results
These are the [latest results from run 2023-10-17T09:02:19.641763](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-13b-hf/blob/main/results_2023-10-17T09-02-19.641763.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652192065,
"f1": 0.05248531879194655,
"f1_stderr": 0.0012515405190332619,
"acc": 0.3964846847094825,
"acc_stderr": 0.011095593973496732
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652192065,
"f1": 0.05248531879194655,
"f1_stderr": 0.0012515405190332619
},
"harness|gsm8k|5": {
"acc": 0.12130401819560273,
"acc_stderr": 0.008992888497275572
},
"harness|winogrande|5": {
"acc": 0.6716653512233622,
"acc_stderr": 0.01319829944971789
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
stockmark/business-questions | ---
license: mit
language:
- ja
---
# Stockmark Business Questions |
ValDoGrajau/ryan1 | ---
license: openrail
---
|
juletxara/mgsm_mt | ---
annotations_creators:
- found
language_creators:
- found
- expert-generated
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- extended|gsm8k
task_categories:
- text2text-generation
task_ids: []
paperswithcode_id: multi-task-language-understanding-on-mgsm
pretty_name: Multilingual Grade School Math Benchmark (MGSM)
tags:
- math-word-problems
dataset_info:
- config_name: nllb-200-distilled-600M
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 56237
num_examples: 250
- name: fr
num_bytes: 55054
num_examples: 250
- name: de
num_bytes: 58288
num_examples: 250
- name: ru
num_bytes: 52498
num_examples: 250
- name: zh
num_bytes: 55255
num_examples: 250
- name: ja
num_bytes: 44046
num_examples: 250
- name: th
num_bytes: 51445
num_examples: 250
- name: sw
num_bytes: 50844
num_examples: 250
- name: bn
num_bytes: 46158
num_examples: 250
- name: te
num_bytes: 49928
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 495413
dataset_size: 522435
- config_name: nllb-200-distilled-1.3B
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 61011
num_examples: 250
- name: fr
num_bytes: 60127
num_examples: 250
- name: de
num_bytes: 61658
num_examples: 250
- name: ru
num_bytes: 58766
num_examples: 250
- name: zh
num_bytes: 55451
num_examples: 250
- name: ja
num_bytes: 51409
num_examples: 250
- name: th
num_bytes: 49158
num_examples: 250
- name: sw
num_bytes: 57085
num_examples: 250
- name: bn
num_bytes: 54208
num_examples: 250
- name: te
num_bytes: 52710
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 537237
dataset_size: 564265
- config_name: nllb-200-1.3B
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 60524
num_examples: 250
- name: fr
num_bytes: 59673
num_examples: 250
- name: de
num_bytes: 60375
num_examples: 250
- name: ru
num_bytes: 57837
num_examples: 250
- name: zh
num_bytes: 58165
num_examples: 250
- name: ja
num_bytes: 58423
num_examples: 250
- name: th
num_bytes: 51044
num_examples: 250
- name: sw
num_bytes: 58507
num_examples: 250
- name: bn
num_bytes: 53901
num_examples: 250
- name: te
num_bytes: 51593
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 545702
dataset_size: 572724
- config_name: nllb-200-3.3B
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 62012
num_examples: 250
- name: fr
num_bytes: 60219
num_examples: 250
- name: de
num_bytes: 61821
num_examples: 250
- name: ru
num_bytes: 58382
num_examples: 250
- name: zh
num_bytes: 58931
num_examples: 250
- name: ja
num_bytes: 58752
num_examples: 250
- name: th
num_bytes: 57139
num_examples: 250
- name: sw
num_bytes: 60391
num_examples: 250
- name: bn
num_bytes: 55057
num_examples: 250
- name: te
num_bytes: 54888
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 563242
dataset_size: 590274
- config_name: xglm-564M
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 42608
num_examples: 250
- name: fr
num_bytes: 45691
num_examples: 250
- name: de
num_bytes: 51470
num_examples: 250
- name: ru
num_bytes: 60715
num_examples: 250
- name: zh
num_bytes: 45629
num_examples: 250
- name: ja
num_bytes: 43786
num_examples: 250
- name: th
num_bytes: 35269
num_examples: 250
- name: sw
num_bytes: 37892
num_examples: 250
- name: bn
num_bytes: 51002
num_examples: 250
- name: te
num_bytes: 98158
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 487886
dataset_size: 514902
- config_name: xglm-1.7B
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 59727
num_examples: 250
- name: fr
num_bytes: 59811
num_examples: 250
- name: de
num_bytes: 60222
num_examples: 250
- name: ru
num_bytes: 58039
num_examples: 250
- name: zh
num_bytes: 44307
num_examples: 250
- name: ja
num_bytes: 40936
num_examples: 250
- name: th
num_bytes: 44383
num_examples: 250
- name: sw
num_bytes: 53708
num_examples: 250
- name: bn
num_bytes: 76978
num_examples: 250
- name: te
num_bytes: 56112
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 529882
dataset_size: 556905
- config_name: xglm-2.9B
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 60811
num_examples: 250
- name: fr
num_bytes: 58777
num_examples: 250
- name: de
num_bytes: 60297
num_examples: 250
- name: ru
num_bytes: 58133
num_examples: 250
- name: zh
num_bytes: 43453
num_examples: 250
- name: ja
num_bytes: 48201
num_examples: 250
- name: th
num_bytes: 39620
num_examples: 250
- name: sw
num_bytes: 56296
num_examples: 250
- name: bn
num_bytes: 50937
num_examples: 250
- name: te
num_bytes: 46948
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 499131
dataset_size: 526155
- config_name: xglm-4.5B
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 68793
num_examples: 250
- name: fr
num_bytes: 68088
num_examples: 250
- name: de
num_bytes: 76522
num_examples: 250
- name: ru
num_bytes: 63439
num_examples: 250
- name: zh
num_bytes: 58577
num_examples: 250
- name: ja
num_bytes: 56872
num_examples: 250
- name: th
num_bytes: 58692
num_examples: 250
- name: sw
num_bytes: 72348
num_examples: 250
- name: bn
num_bytes: 63835
num_examples: 250
- name: te
num_bytes: 58979
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 621817
dataset_size: 648827
- config_name: xglm-7.5B
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 56510
num_examples: 250
- name: fr
num_bytes: 56170
num_examples: 250
- name: de
num_bytes: 56587
num_examples: 250
- name: ru
num_bytes: 55870
num_examples: 250
- name: zh
num_bytes: 53385
num_examples: 250
- name: ja
num_bytes: 51831
num_examples: 250
- name: th
num_bytes: 49858
num_examples: 250
- name: sw
num_bytes: 55484
num_examples: 250
- name: bn
num_bytes: 51975
num_examples: 250
- name: te
num_bytes: 51737
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 515073
dataset_size: 542089
- config_name: bloom-560m
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 47987
num_examples: 250
- name: fr
num_bytes: 43992
num_examples: 250
- name: de
num_bytes: 56995
num_examples: 250
- name: ru
num_bytes: 72240
num_examples: 250
- name: zh
num_bytes: 61450
num_examples: 250
- name: ja
num_bytes: 73445
num_examples: 250
- name: th
num_bytes: 180123
num_examples: 250
- name: sw
num_bytes: 50369
num_examples: 250
- name: bn
num_bytes: 86465
num_examples: 250
- name: te
num_bytes: 75244
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 724012
dataset_size: 750992
- config_name: bloom-1b1
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 56625
num_examples: 250
- name: fr
num_bytes: 53998
num_examples: 250
- name: de
num_bytes: 56874
num_examples: 250
- name: ru
num_bytes: 32323
num_examples: 250
- name: zh
num_bytes: 50902
num_examples: 250
- name: ja
num_bytes: 38347
num_examples: 250
- name: th
num_bytes: 20754
num_examples: 250
- name: sw
num_bytes: 27779
num_examples: 250
- name: bn
num_bytes: 34663
num_examples: 250
- name: te
num_bytes: 24958
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 372897
dataset_size: 399905
- config_name: bloom-1b7
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 44595
num_examples: 250
- name: fr
num_bytes: 48809
num_examples: 250
- name: de
num_bytes: 57435
num_examples: 250
- name: ru
num_bytes: 45954
num_examples: 250
- name: zh
num_bytes: 47375
num_examples: 250
- name: ja
num_bytes: 51493
num_examples: 250
- name: th
num_bytes: 24154
num_examples: 250
- name: sw
num_bytes: 41557
num_examples: 250
- name: bn
num_bytes: 37503
num_examples: 250
- name: te
num_bytes: 42682
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 417273
dataset_size: 444239
- config_name: bloom-3b
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 60956
num_examples: 250
- name: fr
num_bytes: 61243
num_examples: 250
- name: de
num_bytes: 60337
num_examples: 250
- name: ru
num_bytes: 61329
num_examples: 250
- name: zh
num_bytes: 57078
num_examples: 250
- name: ja
num_bytes: 64180
num_examples: 250
- name: th
num_bytes: 24167
num_examples: 250
- name: sw
num_bytes: 45735
num_examples: 250
- name: bn
num_bytes: 45720
num_examples: 250
- name: te
num_bytes: 40840
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 497369
dataset_size: 524267
- config_name: bloom-7b1
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 63425
num_examples: 250
- name: fr
num_bytes: 61340
num_examples: 250
- name: de
num_bytes: 61858
num_examples: 250
- name: ru
num_bytes: 60070
num_examples: 250
- name: zh
num_bytes: 59410
num_examples: 250
- name: ja
num_bytes: 57485
num_examples: 250
- name: th
num_bytes: 24974
num_examples: 250
- name: sw
num_bytes: 58232
num_examples: 250
- name: bn
num_bytes: 57178
num_examples: 250
- name: te
num_bytes: 57703
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 537348
dataset_size: 564357
- config_name: llama-7B
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 55313
num_examples: 250
- name: fr
num_bytes: 61302
num_examples: 250
- name: de
num_bytes: 62152
num_examples: 250
- name: ru
num_bytes: 60929
num_examples: 250
- name: zh
num_bytes: 59157
num_examples: 250
- name: ja
num_bytes: 57356
num_examples: 250
- name: th
num_bytes: 41148
num_examples: 250
- name: sw
num_bytes: 56414
num_examples: 250
- name: bn
num_bytes: 52156
num_examples: 250
- name: te
num_bytes: 7360
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 488983
dataset_size: 515969
- config_name: llama-13B
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 62592
num_examples: 250
- name: fr
num_bytes: 61965
num_examples: 250
- name: de
num_bytes: 62148
num_examples: 250
- name: ru
num_bytes: 61099
num_examples: 250
- name: zh
num_bytes: 59858
num_examples: 250
- name: ja
num_bytes: 55759
num_examples: 250
- name: th
num_bytes: 51280
num_examples: 250
- name: sw
num_bytes: 56081
num_examples: 250
- name: bn
num_bytes: 48204
num_examples: 250
- name: te
num_bytes: 6128
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 500978
dataset_size: 527796
- config_name: llama-30B
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 36577
num_examples: 250
- name: fr
num_bytes: 50763
num_examples: 250
- name: de
num_bytes: 63141
num_examples: 250
- name: ru
num_bytes: 58198
num_examples: 250
- name: zh
num_bytes: 61880
num_examples: 250
- name: ja
num_bytes: 55989
num_examples: 250
- name: th
num_bytes: 53253
num_examples: 250
- name: sw
num_bytes: 59724
num_examples: 250
- name: bn
num_bytes: 51345
num_examples: 250
- name: te
num_bytes: 6546
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 473194
dataset_size: 500098
- config_name: RedPajama-INCITE-Base-3B-v1
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 61548
num_examples: 250
- name: fr
num_bytes: 61357
num_examples: 250
- name: de
num_bytes: 58325
num_examples: 250
- name: ru
num_bytes: 61655
num_examples: 250
- name: zh
num_bytes: 61669
num_examples: 250
- name: ja
num_bytes: 59500
num_examples: 250
- name: th
num_bytes: 31415
num_examples: 250
- name: sw
num_bytes: 72056
num_examples: 250
- name: bn
num_bytes: 26241
num_examples: 250
- name: te
num_bytes: 26116
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 495561
dataset_size: 522564
- config_name: RedPajama-INCITE-7B-Base
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 63198
num_examples: 250
- name: fr
num_bytes: 61124
num_examples: 250
- name: de
num_bytes: 60728
num_examples: 250
- name: ru
num_bytes: 60378
num_examples: 250
- name: zh
num_bytes: 50030
num_examples: 250
- name: ja
num_bytes: 57939
num_examples: 250
- name: th
num_bytes: 25615
num_examples: 250
- name: sw
num_bytes: 60635
num_examples: 250
- name: bn
num_bytes: 18704
num_examples: 250
- name: te
num_bytes: 21116
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 455157
dataset_size: 482149
- config_name: open_llama_3b
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 59734
num_examples: 250
- name: fr
num_bytes: 59925
num_examples: 250
- name: de
num_bytes: 60270
num_examples: 250
- name: ru
num_bytes: 62725
num_examples: 250
- name: zh
num_bytes: 34013
num_examples: 250
- name: ja
num_bytes: 28163
num_examples: 250
- name: th
num_bytes: 13190
num_examples: 250
- name: sw
num_bytes: 46125
num_examples: 250
- name: bn
num_bytes: 5721
num_examples: 250
- name: te
num_bytes: 5605
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 351125
dataset_size: 378153
- config_name: open_llama_7b
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 61962
num_examples: 250
- name: fr
num_bytes: 60687
num_examples: 250
- name: de
num_bytes: 60474
num_examples: 250
- name: ru
num_bytes: 61525
num_examples: 250
- name: zh
num_bytes: 36631
num_examples: 250
- name: ja
num_bytes: 29926
num_examples: 250
- name: th
num_bytes: 11176
num_examples: 250
- name: sw
num_bytes: 61601
num_examples: 250
- name: bn
num_bytes: 5080
num_examples: 250
- name: te
num_bytes: 5899
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 370615
dataset_size: 397643
- config_name: open_llama_13b
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 63245
num_examples: 250
- name: fr
num_bytes: 61569
num_examples: 250
- name: de
num_bytes: 62071
num_examples: 250
- name: ru
num_bytes: 60086
num_examples: 250
- name: zh
num_bytes: 37475
num_examples: 250
- name: ja
num_bytes: 32072
num_examples: 250
- name: th
num_bytes: 12902
num_examples: 250
- name: sw
num_bytes: 58870
num_examples: 250
- name: bn
num_bytes: 5624
num_examples: 250
- name: te
num_bytes: 5647
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 375230
dataset_size: 402243
- config_name: open_llama_7b_v2
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 62306
num_examples: 250
- name: fr
num_bytes: 61168
num_examples: 250
- name: de
num_bytes: 60439
num_examples: 250
- name: ru
num_bytes: 60916
num_examples: 250
- name: zh
num_bytes: 57891
num_examples: 250
- name: ja
num_bytes: 53155
num_examples: 250
- name: th
num_bytes: 34743
num_examples: 250
- name: sw
num_bytes: 58901
num_examples: 250
- name: bn
num_bytes: 34548
num_examples: 250
- name: te
num_bytes: 5253
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 464986
dataset_size: 492002
- config_name: falcon-7b
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 46760
num_examples: 250
- name: fr
num_bytes: 33877
num_examples: 250
- name: de
num_bytes: 51277
num_examples: 250
- name: ru
num_bytes: 59591
num_examples: 250
- name: zh
num_bytes: 37624
num_examples: 250
- name: ja
num_bytes: 46601
num_examples: 250
- name: th
num_bytes: 37107
num_examples: 250
- name: sw
num_bytes: 31857
num_examples: 250
- name: bn
num_bytes: 18472
num_examples: 250
- name: te
num_bytes: 18376
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 357224
dataset_size: 384224
- config_name: xgen-7b-4k-base
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 63837
num_examples: 250
- name: fr
num_bytes: 62076
num_examples: 250
- name: de
num_bytes: 62146
num_examples: 250
- name: ru
num_bytes: 61401
num_examples: 250
- name: zh
num_bytes: 60295
num_examples: 250
- name: ja
num_bytes: 57008
num_examples: 250
- name: th
num_bytes: 18524
num_examples: 250
- name: sw
num_bytes: 56158
num_examples: 250
- name: bn
num_bytes: 25948
num_examples: 250
- name: te
num_bytes: 5803
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 448853
dataset_size: 475878
- config_name: xgen-7b-8k-base
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 63243
num_examples: 250
- name: fr
num_bytes: 60948
num_examples: 250
- name: de
num_bytes: 61832
num_examples: 250
- name: ru
num_bytes: 59217
num_examples: 250
- name: zh
num_bytes: 60354
num_examples: 250
- name: ja
num_bytes: 57012
num_examples: 250
- name: th
num_bytes: 28194
num_examples: 250
- name: sw
num_bytes: 56686
num_examples: 250
- name: bn
num_bytes: 27221
num_examples: 250
- name: te
num_bytes: 5460
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 455836
dataset_size: 482849
- config_name: xgen-7b-8k-inst
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 63113
num_examples: 250
- name: fr
num_bytes: 60264
num_examples: 250
- name: de
num_bytes: 59762
num_examples: 250
- name: ru
num_bytes: 59374
num_examples: 250
- name: zh
num_bytes: 62900
num_examples: 250
- name: ja
num_bytes: 60877
num_examples: 250
- name: th
num_bytes: 26089
num_examples: 250
- name: sw
num_bytes: 57640
num_examples: 250
- name: bn
num_bytes: 24301
num_examples: 250
- name: te
num_bytes: 5290
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 455320
dataset_size: 482292
- config_name: polylm-1.7b
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 55706
num_examples: 250
- name: fr
num_bytes: 55751
num_examples: 250
- name: de
num_bytes: 54071
num_examples: 250
- name: ru
num_bytes: 37159
num_examples: 250
- name: zh
num_bytes: 47577
num_examples: 250
- name: ja
num_bytes: 38931
num_examples: 250
- name: th
num_bytes: 40203
num_examples: 250
- name: sw
num_bytes: 20814
num_examples: 250
- name: bn
num_bytes: 24317
num_examples: 250
- name: te
num_bytes: 7420
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 357603
dataset_size: 384631
- config_name: polylm-13b
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 63444
num_examples: 250
- name: fr
num_bytes: 62136
num_examples: 250
- name: de
num_bytes: 63002
num_examples: 250
- name: ru
num_bytes: 62522
num_examples: 250
- name: zh
num_bytes: 59722
num_examples: 250
- name: ja
num_bytes: 55541
num_examples: 250
- name: th
num_bytes: 57684
num_examples: 250
- name: sw
num_bytes: 46889
num_examples: 250
- name: bn
num_bytes: 28704
num_examples: 250
- name: te
num_bytes: 7883
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 483392
dataset_size: 510209
- config_name: polylm-multialpaca-13b
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 62502
num_examples: 250
- name: fr
num_bytes: 60978
num_examples: 250
- name: de
num_bytes: 62310
num_examples: 250
- name: ru
num_bytes: 60440
num_examples: 250
- name: zh
num_bytes: 57642
num_examples: 250
- name: ja
num_bytes: 55315
num_examples: 250
- name: th
num_bytes: 59002
num_examples: 250
- name: sw
num_bytes: 51728
num_examples: 250
- name: bn
num_bytes: 31947
num_examples: 250
- name: te
num_bytes: 12891
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 490498
dataset_size: 517437
- config_name: open_llama_3b_v2
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 62474
num_examples: 250
- name: fr
num_bytes: 60493
num_examples: 250
- name: de
num_bytes: 59760
num_examples: 250
- name: ru
num_bytes: 57592
num_examples: 250
- name: zh
num_bytes: 54634
num_examples: 250
- name: ja
num_bytes: 53936
num_examples: 250
- name: th
num_bytes: 38960
num_examples: 250
- name: sw
num_bytes: 57320
num_examples: 250
- name: bn
num_bytes: 27394
num_examples: 250
- name: te
num_bytes: 4680
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 452910
dataset_size: 479925
- config_name: Llama-2-7b-hf
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 63035
num_examples: 250
- name: fr
num_bytes: 61128
num_examples: 250
- name: de
num_bytes: 61496
num_examples: 250
- name: ru
num_bytes: 59918
num_examples: 250
- name: zh
num_bytes: 59415
num_examples: 250
- name: ja
num_bytes: 54466
num_examples: 250
- name: th
num_bytes: 37269
num_examples: 250
- name: sw
num_bytes: 53461
num_examples: 250
- name: bn
num_bytes: 42955
num_examples: 250
- name: te
num_bytes: 7122
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 475925
dataset_size: 502947
- config_name: Llama-2-13b-hf
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 63347
num_examples: 250
- name: fr
num_bytes: 62187
num_examples: 250
- name: de
num_bytes: 63309
num_examples: 250
- name: ru
num_bytes: 62772
num_examples: 250
- name: zh
num_bytes: 62210
num_examples: 250
- name: ja
num_bytes: 59083
num_examples: 250
- name: th
num_bytes: 57690
num_examples: 250
- name: sw
num_bytes: 57538
num_examples: 250
- name: bn
num_bytes: 54947
num_examples: 250
- name: te
num_bytes: 7062
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 525803
dataset_size: 552827
- config_name: Llama-2-7b-chat-hf
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 58203
num_examples: 250
- name: fr
num_bytes: 40149
num_examples: 250
- name: de
num_bytes: 57587
num_examples: 250
- name: ru
num_bytes: 47777
num_examples: 250
- name: zh
num_bytes: 50018
num_examples: 250
- name: ja
num_bytes: 54107
num_examples: 250
- name: th
num_bytes: 41549
num_examples: 250
- name: sw
num_bytes: 61414
num_examples: 250
- name: bn
num_bytes: 37996
num_examples: 250
- name: te
num_bytes: 10156
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 434632
dataset_size: 461638
- config_name: Llama-2-13b-chat-hf
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_number
dtype: int32
- name: equation_solution
dtype: string
splits:
- name: es
num_bytes: 63304
num_examples: 250
- name: fr
num_bytes: 61708
num_examples: 250
- name: de
num_bytes: 63291
num_examples: 250
- name: ru
num_bytes: 62305
num_examples: 250
- name: zh
num_bytes: 61994
num_examples: 250
- name: ja
num_bytes: 58226
num_examples: 250
- name: th
num_bytes: 60256
num_examples: 250
- name: sw
num_bytes: 58108
num_examples: 250
- name: bn
num_bytes: 55180
num_examples: 250
- name: te
num_bytes: 6525
num_examples: 250
- name: train
num_bytes: 2682
num_examples: 8
download_size: 526574
dataset_size: 553579
---
# Dataset Card for MGSM MT
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://openai.com/blog/grade-school-math/
- **Repository:** https://github.com/openai/grade-school-math
- **Paper:** https://arxiv.org/abs/2110.14168
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
Multilingual Grade School Math Benchmark (MGSM) is a benchmark of grade-school math problems, proposed in the paper [Language models are multilingual chain-of-thought reasoners](http://arxiv.org/abs/2210.03057). This dataset is the version of MGSM machine-translated into English from each language.
The same 250 problems from [GSM8K](https://arxiv.org/abs/2110.14168) are each translated by human annotators into 10 languages. The 10 languages are:
- Spanish
- French
- German
- Russian
- Chinese
- Japanese
- Thai
- Swahili
- Bengali
- Telugu
GSM8K (Grade School Math 8K) is a dataset of 8.5K high quality linguistically diverse grade school math word problems. The dataset was created to support the task of question answering on basic mathematical problems that require multi-step reasoning.
You can find the input and targets for each of the ten languages (and English) as `.tsv` files.
We also include few-shot exemplars that are also manually translated from each language in `exemplars.py`.
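As an illustration, a given translation configuration and source-language split can be loaded as follows (a minimal sketch; the configuration and split names are those listed in the dataset metadata above):
```python
from datasets import load_dataset

# English machine translations of the Spanish MGSM problems,
# produced by the NLLB-200 distilled 600M model
mgsm_mt = load_dataset("juletxara/mgsm_mt", "nllb-200-distilled-600M", split="es")
print(mgsm_mt[0]["question"])
```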
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
The same 250 problems from [GSM8K](https://arxiv.org/abs/2110.14168) are each translated by human annotators into 10 languages. The 10 languages are:
- Spanish
- French
- German
- Russian
- Chinese
- Japanese
- Thai
- Swahili
- Bengali
- Telugu
This dataset is the version of MGSM machine-translated into English from each language.
## Dataset Structure
### Data Instances
Each instance in the train split contains:
- a string for the grade-school level math question
- a string for the corresponding answer with chain-of-thought steps.
- the numeric solution to the question
- the equation solution to the question
```python
{'question': 'Question: Roger has 5 tennis balls. He buys 2 more cans of tennis balls. Each can has 3 tennis balls. How many tennis balls does he have now?',
'answer': 'Step-by-Step Answer: Roger started with 5 balls. 2 cans of 3 tennis balls each is 6 tennis balls. 5 + 6 = 11. The answer is 11.',
'answer_number': 11,
'equation_solution': '5 + 6 = 11.'}
```
Each instance in the test split contains:
- a string for the grade-school level math question
- the numeric solution to the question
```python
{'question': "Janet’s ducks lay 16 eggs per day. She eats three for breakfast every morning and bakes muffins for her friends every day with four. She sells the remainder at the farmers' market daily for $2 per fresh duck egg. How much in dollars does she make every day at the farmers' market?",
'answer': None,
'answer_number': 18,
'equation_solution': None}
```
### Data Fields
The data fields are the same among `train` and `test` splits.
- question: The question string to a grade school math problem.
- answer: The full solution string to the `question`. It contains multiple steps of reasoning with calculator annotations and the final numeric solution.
- answer_number: The numeric solution to the `question`.
- equation_solution: The equation solution to the `question`.
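A minimal sketch of how the `answer_number` field is typically used to score a model's free-form prediction (the answer-extraction heuristic below is an assumption for illustration, not part of the dataset):
```python
import re

def is_correct(prediction: str, example: dict) -> bool:
    # Take the last integer mentioned in the prediction as the model's answer
    # and compare it with the gold numeric solution.
    numbers = re.findall(r"-?\d+", prediction.replace(",", ""))
    return bool(numbers) and int(numbers[-1]) == example["answer_number"]
```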
### Data Splits
- The train split includes 8 few-shot exemplars that are also manually translated from each language.
- The test split includes the same 250 problems from GSM8K translated via human annotators in 10 languages.
| name |train|test |
|--------|----:|---------:|
|en | 8 | 250 |
|es | 8 | 250 |
|fr | 8 | 250 |
|de | 8 | 250 |
|ru | 8 | 250 |
|zh | 8 | 250 |
|ja | 8 | 250 |
|th | 8 | 250 |
|sw | 8 | 250 |
|bn | 8 | 250 |
|te | 8 | 250 |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
From the paper:
> We initially collected a starting set of a thousand problems and natural language solutions by hiring freelance contractors on Upwork (upwork.com). We then worked with Surge AI (surgehq.ai), an NLP data labeling platform, to scale up our data collection. After collecting the full dataset, we asked workers to re-solve all problems, with no workers re-solving problems they originally wrote. We checked whether their final answers agreed with the original solutions, and any problems that produced disagreements were either repaired or discarded. We then performed another round of agreement checks on a smaller subset of problems, finding that 1.7% of problems still produce disagreements among contractors. We estimate this to be the fraction of problems that contain breaking errors or ambiguities. It is possible that a larger percentage of problems contain subtle errors.
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
Surge AI (surgehq.ai)
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
The GSM8K dataset is licensed under the [MIT License](https://opensource.org/licenses/MIT).
### Citation Information
```bibtex
@article{cobbe2021gsm8k,
title={Training Verifiers to Solve Math Word Problems},
author={Cobbe, Karl and Kosaraju, Vineet and Bavarian, Mohammad and Chen, Mark and Jun, Heewoo and Kaiser, Lukasz and Plappert, Matthias and Tworek, Jerry and Hilton, Jacob and Nakano, Reiichiro and Hesse, Christopher and Schulman, John},
journal={arXiv preprint arXiv:2110.14168},
year={2021}
}
@misc{shi2022language,
title={Language Models are Multilingual Chain-of-Thought Reasoners},
author={Freda Shi and Mirac Suzgun and Markus Freitag and Xuezhi Wang and Suraj Srivats and Soroush Vosoughi and Hyung Won Chung and Yi Tay and Sebastian Ruder and Denny Zhou and Dipanjan Das and Jason Wei},
year={2022},
eprint={2210.03057},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@juletx](https://github.com/juletx) for adding this dataset. |
Nguyendo1999/mmath | ---
license: mit
task_categories:
- table-question-answering
language:
- vi
pretty_name: mmath
size_categories:
- n<1K
--- |
qgiaohc/twitter_dataset_1713110219 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 39752
num_examples: 98
download_size: 20576
dataset_size: 39752
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|