datasetId | card |
|---|---|
ChayanM/mimic-llama2-150k | ---
dataset_info:
features:
- name: 'FINDINGS:'
dtype: string
splits:
- name: train
num_bytes: 59974646
num_examples: 149756
download_size: 20364864
dataset_size: 59974646
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
heangborin/roofGoogleSatelliteKh | ---
license: cc-by-sa-4.0
---
|
mespinosami/map2sat10-samples | ---
dataset_info:
features:
- name: input_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 102633.0
num_examples: 10
download_size: 104891
dataset_size: 102633.0
---
# Dataset Card for "map2sat10-samples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibm/tab_fact | ---
license: cc-by-4.0
task_categories:
- text-classification
---
- **Homepage:** [TabFact](https://tabfact.github.io/index.html)
- **Repository:** [GitHub](https://github.com/wenhuchen/Table-Fact-Checking)
- **Paper:** [TabFact: A Large-scale Dataset for Table-based Fact Verification](https://arxiv.org/abs/1909.02164)
### Dataset Summary
tab_fact is a large-scale dataset for the task of fact-checking on tables. The original dataset is available at https://huggingface.co/datasets/tab_fact; this is a slightly modified version in which the tabular data has been processed into a specific format for ease of use.
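A minimal loading sketch, assuming the repo's data files and splits are auto-detected by `datasets`:
```python
from datasets import load_dataset

# Minimal sketch: load the preprocessed TabFact tables and statements.
ds = load_dataset("ibm/tab_fact")
print(ds)  # shows the detected splits and features
```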
### Citation Information
```
@inproceedings{2019TabFactA,
title={TabFact: A Large-scale Dataset for Table-based Fact Verification},
author={Wenhu Chen and Hongmin Wang and Jianshu Chen and Yunkai Zhang and Hong Wang and Shiyang Li and Xiyou Zhou and William Yang Wang},
booktitle = {International Conference on Learning Representations (ICLR)},
address = {Addis Ababa, Ethiopia},
month = {April},
year = {2020}
}
``` |
banned-historical-archives/banned-historical-archives | ---
size_categories:
- n>1T
---
# 和谐历史档案馆数据集 - Banned Historical Archives Datasets
This repository stores the original files that have been cataloged on banned-historical-archives.github.io as well as those not yet cataloged. The archives* directories contain the cataloged files and are synced from GitHub at irregular intervals; the todo directory holds files that have not yet been cataloged.
For People's Daily (人民日报) and Wenhui Bao (文汇报), only selected important articles have been cataloged on the website; the complete original files are stored at:
https://huggingface.co/datasets/banned-historical-archives/rmrb
https://huggingface.co/datasets/banned-historical-archives/wenhuibao
https://huggingface.co/datasets/banned-historical-archives/wenhuibao_disk
## Notes
* The original files exceed 600 GB. Because git LFS is used, twice that much storage is needed; make sure you have more than 1.2 TB of free disk space before cloning.
* When cloning, use `git clone --depth 1`; otherwise the full commit history will be downloaded, slowing the clone (a partial-download sketch follows this list).
* Files in the todo folder should be deleted promptly once they have been cataloged, to avoid duplicate entries.
* To clean up large files not tracked by the current commit, run: `git lfs prune && git gc`
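If you only need part of the repository, a minimal Python sketch using `snapshot_download` from `huggingface_hub` can fetch selected directories instead of a full clone; the `todo/*` pattern below is illustrative:
```python
from huggingface_hub import snapshot_download

# Sketch: download only a subset of this >600 GB dataset repo instead of a
# full git clone. The "todo/*" pattern is illustrative; adjust it as needed.
local_dir = snapshot_download(
    repo_id="banned-historical-archives/banned-historical-archives",
    repo_type="dataset",
    allow_patterns=["todo/*"],
)
print(local_dir)
```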
|
Pastinacalda/SUIM | ---
license: unknown
---
|
fia24/dataverse_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: translation
struct:
- name: en
dtype: string
- name: fr
dtype: string
splits:
- name: train
num_bytes: 6289213
num_examples: 19799
- name: test
num_bytes: 696310
num_examples: 2200
download_size: 3524483
dataset_size: 6985523
---
# Dataset Card for "dataverse_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
semihGuner2002/PhishingURLsDataset | ---
license: apache-2.0
dataset_info:
features:
- name: url
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0.0'
'1': '1.0'
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 38536329.817049906
num_examples: 642533
- name: test
num_bytes: 6800578.1829500925
num_examples: 113389
download_size: 32729166
dataset_size: 45336908.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# PhishingURLDataset
This dataset was created for training neural networks on phishing-website detection.
It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
This dataset contains phishing websites, labeled "1" and referred to as "malignant", and benign websites, labeled "0".
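A short loading sketch based on the features listed above (`url`, `label`):
```python
from datasets import load_dataset

# Sketch: load the train split and inspect one labeled URL.
ds = load_dataset("semihGuner2002/PhishingURLsDataset", split="train")
example = ds[0]
print(example["url"], example["label"])  # 1 = phishing ("malignant"), 0 = benign
```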
### Dataset Sources
- **Kaggle Dataset on Phishing URLs:** https://www.kaggle.com/datasets/siddharthkumar25/malicious-and-benign-urls
- **USOM Phishing Websites Dataset:** https://www.usom.gov.tr/url-list.txt
- **Phishtank Dataset:** http://data.phishtank.com/data/online-valid.csv
|
Thanmay/arc-easy-hi | ---
dataset_info:
features:
- name: id
dtype: string
- name: answerKey
dtype: string
- name: itv2 hi
dtype: string
- name: question
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
splits:
- name: test
num_bytes: 2706380
num_examples: 2376
- name: validation
num_bytes: 648811
num_examples: 570
download_size: 1235614
dataset_size: 3355191
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_KeyonZeng__lion-gemma-2b | ---
pretty_name: Evaluation run of KeyonZeng/lion-gemma-2b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KeyonZeng/lion-gemma-2b](https://huggingface.co/KeyonZeng/lion-gemma-2b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KeyonZeng__lion-gemma-2b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T19:39:48.412016](https://huggingface.co/datasets/open-llm-leaderboard/details_KeyonZeng__lion-gemma-2b/blob/main/results_2024-03-27T19-39-48.412016.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5668938174135245,\n\
\ \"acc_stderr\": 0.033920558669041014,\n \"acc_norm\": 0.5732411787021572,\n\
\ \"acc_norm_stderr\": 0.034638970392460354,\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.4792261052445388,\n\
\ \"mc2_stderr\": 0.015314648048905671\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4803754266211604,\n \"acc_stderr\": 0.01460013207594709,\n\
\ \"acc_norm\": 0.5110921501706485,\n \"acc_norm_stderr\": 0.01460779491401305\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5536745668193587,\n\
\ \"acc_stderr\": 0.004960947388535103,\n \"acc_norm\": 0.7347142003584943,\n\
\ \"acc_norm_stderr\": 0.00440582999325873\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04082482904638629,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04082482904638629\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n\
\ \"acc_stderr\": 0.02525303255499769,\n \"acc_norm\": 0.4021164021164021,\n\
\ \"acc_norm_stderr\": 0.02525303255499769\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.04163453031302859,\n\
\ \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.04163453031302859\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.6935483870967742,\n \"acc_stderr\": 0.02622648565255388,\n\
\ \"acc_norm\": 0.6935483870967742,\n \"acc_norm_stderr\": 0.02622648565255388\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187897,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187897\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845436,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.025069094387296532,\n\
\ \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.025069094387296532\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.0291857149498574,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.0291857149498574\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4304635761589404,\n \"acc_stderr\": 0.04042809961395634,\n \"\
acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.04042809961395634\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790215,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790215\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7156862745098039,\n \"acc_stderr\": 0.03166009679399811,\n \"\
acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.03166009679399811\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.028756799629658342,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899616,\n\
\ \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572922,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572922\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335452,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335452\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7369093231162197,\n\
\ \"acc_stderr\": 0.015745497169049053,\n \"acc_norm\": 0.7369093231162197,\n\
\ \"acc_norm_stderr\": 0.015745497169049053\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124658,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124658\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3139664804469274,\n\
\ \"acc_stderr\": 0.01552192393352364,\n \"acc_norm\": 0.3139664804469274,\n\
\ \"acc_norm_stderr\": 0.01552192393352364\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.027245613047215365,\n\
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.027245613047215365\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192707,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.027272582849839796,\n\
\ \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.027272582849839796\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612503,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612503\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.394393741851369,\n\
\ \"acc_stderr\": 0.01248214166563119,\n \"acc_norm\": 0.394393741851369,\n\
\ \"acc_norm_stderr\": 0.01248214166563119\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.03027332507734576,\n\
\ \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.03027332507734576\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5326797385620915,\n \"acc_stderr\": 0.0201845833591022,\n \
\ \"acc_norm\": 0.5326797385620915,\n \"acc_norm_stderr\": 0.0201845833591022\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n\
\ \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.4792261052445388,\n\
\ \"mc2_stderr\": 0.015314648048905671\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7079715864246251,\n \"acc_stderr\": 0.012779198491754027\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2630780894617134,\n \
\ \"acc_stderr\": 0.012128172607375925\n }\n}\n```"
repo_url: https://huggingface.co/KeyonZeng/lion-gemma-2b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|arc:challenge|25_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|gsm8k|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hellaswag|10_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T19-39-48.412016.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T19-39-48.412016.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- '**/details_harness|winogrande|5_2024-03-27T19-39-48.412016.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T19-39-48.412016.parquet'
- config_name: results
data_files:
- split: 2024_03_27T19_39_48.412016
path:
- results_2024-03-27T19-39-48.412016.parquet
- split: latest
path:
- results_2024-03-27T19-39-48.412016.parquet
---
# Dataset Card for Evaluation run of KeyonZeng/lion-gemma-2b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KeyonZeng/lion-gemma-2b](https://huggingface.co/KeyonZeng/lion-gemma-2b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KeyonZeng__lion-gemma-2b",
"harness_winogrande_5",
split="train")
```
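Similarly, the aggregated metrics can be loaded from the "results" configuration described above, using the "latest" split defined in this repo's configs:
```python
from datasets import load_dataset

# Sketch: load the aggregated "results" configuration; the "latest" split
# points to the newest evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_KeyonZeng__lion-gemma-2b",
    "results",
    split="latest",
)
print(results[0])
```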
## Latest results
These are the [latest results from run 2024-03-27T19:39:48.412016](https://huggingface.co/datasets/open-llm-leaderboard/details_KeyonZeng__lion-gemma-2b/blob/main/results_2024-03-27T19-39-48.412016.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5668938174135245,
"acc_stderr": 0.033920558669041014,
"acc_norm": 0.5732411787021572,
"acc_norm_stderr": 0.034638970392460354,
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559693,
"mc2": 0.4792261052445388,
"mc2_stderr": 0.015314648048905671
},
"harness|arc:challenge|25": {
"acc": 0.4803754266211604,
"acc_stderr": 0.01460013207594709,
"acc_norm": 0.5110921501706485,
"acc_norm_stderr": 0.01460779491401305
},
"harness|hellaswag|10": {
"acc": 0.5536745668193587,
"acc_stderr": 0.004960947388535103,
"acc_norm": 0.7347142003584943,
"acc_norm_stderr": 0.00440582999325873
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.04082482904638629,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04082482904638629
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6935483870967742,
"acc_stderr": 0.02622648565255388,
"acc_norm": 0.6935483870967742,
"acc_norm_stderr": 0.02622648565255388
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187897,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187897
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845436,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.025069094387296532,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.025069094387296532
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.0291857149498574,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.0291857149498574
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790215,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790215
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.03166009679399811,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.03166009679399811
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6257668711656442,
"acc_stderr": 0.03802068102899616,
"acc_norm": 0.6257668711656442,
"acc_norm_stderr": 0.03802068102899616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572922,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572922
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335452,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335452
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7369093231162197,
"acc_stderr": 0.015745497169049053,
"acc_norm": 0.7369093231162197,
"acc_norm_stderr": 0.015745497169049053
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124658,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124658
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3139664804469274,
"acc_stderr": 0.01552192393352364,
"acc_norm": 0.3139664804469274,
"acc_norm_stderr": 0.01552192393352364
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.027245613047215365,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.027245613047215365
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192707,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5987654320987654,
"acc_stderr": 0.027272582849839796,
"acc_norm": 0.5987654320987654,
"acc_norm_stderr": 0.027272582849839796
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.394393741851369,
"acc_stderr": 0.01248214166563119,
"acc_norm": 0.394393741851369,
"acc_norm_stderr": 0.01248214166563119
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.03027332507734576,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.03027332507734576
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5326797385620915,
"acc_stderr": 0.0201845833591022,
"acc_norm": 0.5326797385620915,
"acc_norm_stderr": 0.0201845833591022
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559693,
"mc2": 0.4792261052445388,
"mc2_stderr": 0.015314648048905671
},
"harness|winogrande|5": {
"acc": 0.7079715864246251,
"acc_stderr": 0.012779198491754027
},
"harness|gsm8k|5": {
"acc": 0.2630780894617134,
"acc_stderr": 0.012128172607375925
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CopyleftCultivars/Semisynthetic_Data_Natural_Farming_Fundamentals | ---
license: mit
---
This dataset was created semi-synthetically by Copyleft Cultivars Nonprofit using a RAG system containing the official English versions of Korean Natural Farming teaching texts, along with data from open nutrient projects, connected to a ChatGPT4 API; the output was then lightly cleaned by Caleb DeLeeuw. The dataset is in JSON format.
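A minimal loading sketch, assuming the Hugging Face `datasets` library can auto-detect the JSON file(s) at the repository root:
```python
from datasets import load_dataset

# assumes the JSON file(s) sit at the repository root and are
# auto-detected; check the repo's file listing if this fails
ds = load_dataset("CopyleftCultivars/Semisynthetic_Data_Natural_Farming_Fundamentals")
print(ds)
```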
|
sudipto-ducs/mini-platypus | ---
license: apache-2.0
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1072343
num_examples: 1000
download_size: 560564
dataset_size: 1072343
---
|
dtldhjh/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
dtype: string
- name: labels
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
dtype: string
- name: assignees
dtype: string
- name: milestone
dtype: string
- name: comments
sequence: string
- name: created_at
dtype: string
- name: updated_at
dtype: string
- name: closed_at
dtype: string
- name: author_association
dtype: string
- name: active_lock_reason
dtype: float64
- name: body
dtype: string
- name: reactions
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: float64
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 24390715
num_examples: 5000
download_size: 6142822
dataset_size: 24390715
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
laaaarrywang/test | ---
task_categories:
- text-classification
- feature-extraction
language:
- en
tags:
- finance
size_categories:
- 1K<n<10K
--- |
KaiLv/UDR_Yahoo | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: label
dtype: int64
- name: title
dtype: string
- name: content
dtype: string
- name: sentence
dtype: string
- name: len_sentence
dtype: int64
splits:
- name: train
num_bytes: 17812235
num_examples: 29150
- name: test
num_bytes: 1767766
num_examples: 3000
- name: debug
num_bytes: 3032530
num_examples: 5000
download_size: 14936274
dataset_size: 22612531
---
# Dataset Card for "UDR_Yahoo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zqjilove/chill | ---
license: openrail
---
|
CyberHarem/cornelia_arnim_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of cornelia_arnim (Fire Emblem)
This is the dataset of cornelia_arnim (Fire Emblem), containing 40 images and their tags.
The core tags of this character are `breasts, long_hair, large_breasts, green_eyes, pink_hair, blue_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 40 | 74.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cornelia_arnim_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 40 | 36.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cornelia_arnim_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 99 | 79.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cornelia_arnim_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 40 | 65.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cornelia_arnim_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 99 | 125.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cornelia_arnim_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/cornelia_arnim_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
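For the IMG+TXT packages, a similar sketch works without waifuc. This is only a sketch: the `dataset-800.zip` filename comes from the table above, and the assumption is that each image is paired with a same-named `.txt` tag file (the IMG+TXT layout):
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package (filename from the table above)
zip_file = hf_hub_download(
    repo_id='CyberHarem/cornelia_arnim_fireemblem',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract images and their tag files side by side
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

print(sorted(os.listdir(dataset_dir))[:10])
```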
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------|
| 0 | 33 |  |  |  |  |  | 1girl, solo, smile, circlet, cleavage, dress, looking_at_viewer, jewelry, simple_background, bare_shoulders, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | circlet | cleavage | dress | looking_at_viewer | jewelry | simple_background | bare_shoulders | thighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:----------|:-----------|:--------|:--------------------|:----------|:--------------------|:-----------------|:---------|
| 0 | 33 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
|
Alexator26/800_second_face_stickers_cleared | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: cartoonized_image
dtype: image
splits:
- name: train
num_bytes: 127188525.0
num_examples: 206
download_size: 127191795
dataset_size: 127188525.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
datasets-examples/doc-formats-jsonl-1 | ---
size_categories:
- n<1K
---
# [doc] formats - jsonl - 1
This dataset contains one jsonl file at the root.
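A minimal loading sketch; the `datasets` library auto-detects the single jsonl file at the root:
```python
from datasets import load_dataset

# the lone .jsonl file at the repository root is auto-detected and
# exposed as the default split (typically "train")
ds = load_dataset("datasets-examples/doc-formats-jsonl-1")
print(ds)
```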
|
liuyanchen1015/MULTI_VALUE_mnli_me_us | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 53725
num_examples: 253
- name: dev_mismatched
num_bytes: 82785
num_examples: 395
- name: test_matched
num_bytes: 75520
num_examples: 358
- name: test_mismatched
num_bytes: 77093
num_examples: 359
- name: train
num_bytes: 2415658
num_examples: 11573
download_size: 1551924
dataset_size: 2704781
---
# Dataset Card for "MULTI_VALUE_mnli_me_us"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ylacombe/google-gujarati | ---
dataset_info:
- config_name: female
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: speaker_id
dtype: int64
splits:
- name: train
num_bytes: 1481707913.184
num_examples: 2219
download_size: 1171501409
dataset_size: 1481707913.184
- config_name: male
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: speaker_id
dtype: int64
splits:
- name: train
num_bytes: 1233511380.616
num_examples: 2053
download_size: 1011014041
dataset_size: 1233511380.616
configs:
- config_name: female
data_files:
- split: train
path: female/train-*
- config_name: male
data_files:
- split: train
path: male/train-*
---
# Dataset Card for "google-gujarati"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
C-MTEB/ThuNewsClusteringP2P | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: sentences
sequence: string
- name: labels
sequence: string
splits:
- name: test
num_bytes: 31552896
num_examples: 10
download_size: 23299710
dataset_size: 31552896
---
# Dataset Card for "ThuNewsClusteringP2P"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_chlee10__T3Q-MSlerp-7Bx2 | ---
pretty_name: Evaluation run of chlee10/T3Q-MSlerp-7Bx2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chlee10/T3Q-MSlerp-7Bx2](https://huggingface.co/chlee10/T3Q-MSlerp-7Bx2) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chlee10__T3Q-MSlerp-7Bx2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-13T03:13:45.913085](https://huggingface.co/datasets/open-llm-leaderboard/details_chlee10__T3Q-MSlerp-7Bx2/blob/main/results_2024-03-13T03-13-45.913085.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2589646771433041,\n\
\ \"acc_stderr\": 0.030862711301155407,\n \"acc_norm\": 0.2594822816717954,\n\
\ \"acc_norm_stderr\": 0.03168729202498976,\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.014450846714123892,\n \"mc2\": 0.47282237996457954,\n\
\ \"mc2_stderr\": 0.016347102378553885\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22781569965870307,\n \"acc_stderr\": 0.01225670860232692,\n\
\ \"acc_norm\": 0.2841296928327645,\n \"acc_norm_stderr\": 0.013179442447653886\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2545309699263095,\n\
\ \"acc_stderr\": 0.0043470700195274775,\n \"acc_norm\": 0.2546305516829317,\n\
\ \"acc_norm_stderr\": 0.004347629889040943\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.037385206761196665,\n\
\ \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.037385206761196665\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.17,\n\
\ \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.17,\n \
\ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.033450369167889925,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.033450369167889925\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n\
\ \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.02271746789770861,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.02271746789770861\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3193548387096774,\n\
\ \"acc_stderr\": 0.02652270967466777,\n \"acc_norm\": 0.3193548387096774,\n\
\ \"acc_norm_stderr\": 0.02652270967466777\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n\
\ \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479049,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479049\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178256,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178256\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.021020672680827912,\n\
\ \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.021020672680827912\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.28807339449541286,\n\
\ \"acc_stderr\": 0.01941644589263602,\n \"acc_norm\": 0.28807339449541286,\n\
\ \"acc_norm_stderr\": 0.01941644589263602\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.027920963147993656,\n\
\ \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.027920963147993656\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.03114557065948678,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.03114557065948678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035282,\n \
\ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035282\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.17040358744394618,\n\
\ \"acc_stderr\": 0.02523459344713618,\n \"acc_norm\": 0.17040358744394618,\n\
\ \"acc_norm_stderr\": 0.02523459344713618\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n\
\ \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.2707535121328225,\n\
\ \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.02532988817190092,\n\
\ \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.02532988817190092\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.02624492034984301,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.02624492034984301\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24837027379400262,\n\
\ \"acc_stderr\": 0.011035212598034501,\n \"acc_norm\": 0.24837027379400262,\n\
\ \"acc_norm_stderr\": 0.011035212598034501\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3639705882352941,\n \"acc_stderr\": 0.02922719246003203,\n\
\ \"acc_norm\": 0.3639705882352941,\n \"acc_norm_stderr\": 0.02922719246003203\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2761437908496732,\n \"acc_stderr\": 0.018087276935663137,\n \
\ \"acc_norm\": 0.2761437908496732,\n \"acc_norm_stderr\": 0.018087276935663137\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3877551020408163,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.3877551020408163,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n\
\ \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n\
\ \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.014450846714123892,\n \"mc2\": 0.47282237996457954,\n\
\ \"mc2_stderr\": 0.016347102378553885\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5438042620363063,\n \"acc_stderr\": 0.01399845361092432\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/chlee10/T3Q-MSlerp-7Bx2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|arc:challenge|25_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|gsm8k|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hellaswag|10_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T03-13-45.913085.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T03-13-45.913085.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- '**/details_harness|winogrande|5_2024-03-13T03-13-45.913085.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-13T03-13-45.913085.parquet'
- config_name: results
data_files:
- split: 2024_03_13T03_13_45.913085
path:
- results_2024-03-13T03-13-45.913085.parquet
- split: latest
path:
- results_2024-03-13T03-13-45.913085.parquet
---
# Dataset Card for Evaluation run of chlee10/T3Q-MSlerp-7Bx2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chlee10/T3Q-MSlerp-7Bx2](https://huggingface.co/chlee10/T3Q-MSlerp-7Bx2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chlee10__T3Q-MSlerp-7Bx2",
"harness_winogrande_5",
split="train")
```
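The aggregated metrics live in the "results" configuration described above; a minimal sketch for loading them (the split names follow the YAML config list):
```python
from datasets import load_dataset

# load the aggregated results; the "latest" split points to the most recent run
results = load_dataset("open-llm-leaderboard/details_chlee10__T3Q-MSlerp-7Bx2",
    "results",
    split="latest")
```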
## Latest results
These are the [latest results from run 2024-03-13T03:13:45.913085](https://huggingface.co/datasets/open-llm-leaderboard/details_chlee10__T3Q-MSlerp-7Bx2/blob/main/results_2024-03-13T03-13-45.913085.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2589646771433041,
"acc_stderr": 0.030862711301155407,
"acc_norm": 0.2594822816717954,
"acc_norm_stderr": 0.03168729202498976,
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123892,
"mc2": 0.47282237996457954,
"mc2_stderr": 0.016347102378553885
},
"harness|arc:challenge|25": {
"acc": 0.22781569965870307,
"acc_stderr": 0.01225670860232692,
"acc_norm": 0.2841296928327645,
"acc_norm_stderr": 0.013179442447653886
},
"harness|hellaswag|10": {
"acc": 0.2545309699263095,
"acc_stderr": 0.0043470700195274775,
"acc_norm": 0.2546305516829317,
"acc_norm_stderr": 0.004347629889040943
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3026315789473684,
"acc_stderr": 0.037385206761196665,
"acc_norm": 0.3026315789473684,
"acc_norm_stderr": 0.037385206761196665
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.033450369167889925,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.033450369167889925
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349424,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.02271746789770861,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.02271746789770861
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3193548387096774,
"acc_stderr": 0.02652270967466777,
"acc_norm": 0.3193548387096774,
"acc_norm_stderr": 0.02652270967466777
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.035243908445117836,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.035243908445117836
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178256,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178256
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.021020672680827912,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.021020672680827912
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28807339449541286,
"acc_stderr": 0.01941644589263602,
"acc_norm": 0.28807339449541286,
"acc_norm_stderr": 0.01941644589263602
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.027920963147993656,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.027920963147993656
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.03114557065948678,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.03114557065948678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.028304657943035282,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.028304657943035282
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.17040358744394618,
"acc_stderr": 0.02523459344713618,
"acc_norm": 0.17040358744394618,
"acc_norm_stderr": 0.02523459344713618
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.29190751445086704,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.29190751445086704,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2932098765432099,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.2932098765432099,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.02624492034984301,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.02624492034984301
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24837027379400262,
"acc_stderr": 0.011035212598034501,
"acc_norm": 0.24837027379400262,
"acc_norm_stderr": 0.011035212598034501
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3639705882352941,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.3639705882352941,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2761437908496732,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.2761437908496732,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3877551020408163,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.3877551020408163,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123892,
"mc2": 0.47282237996457954,
"mc2_stderr": 0.016347102378553885
},
"harness|winogrande|5": {
"acc": 0.5438042620363063,
"acc_stderr": 0.01399845361092432
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/bibeak_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of bibeak/バイビーク/柏喙 (Arknights)
This is the dataset of bibeak/バイビーク/柏喙 (Arknights), containing 23 images and their tags.
The core tags of this character are `long_hair, breasts, hair_ornament, grey_hair, hair_flower, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 45.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bibeak_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200             | 23 | 37.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bibeak_arknights/resolve/main/dataset-1200.zip)              | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 56 | 71.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bibeak_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/bibeak_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
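The packed `IMG+TXT` variants can be fetched the same way; a minimal sketch for the `1200` package (filename taken from the table above):
```python
from huggingface_hub import hf_hub_download

# download the 1200px IMG+TXT package listed in the packages table
zip_file = hf_hub_download(
    repo_id='CyberHarem/bibeak_arknights',
    repo_type='dataset',
    filename='dataset-1200.zip',
)
```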
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, dress, frills, looking_at_viewer, solo, white_gloves, bare_shoulders, blue_rose, closed_mouth, feather_hair, green_eyes, large_breasts, wings, yellow_eyes, character_name, dated, feathers, medium_breasts, off_shoulder, simple_background, sitting |
| 1 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, solo, dress, frills, white_gloves, skirt, blue_flower, closed_mouth, grey_eyes, hair_bow, holding_sword, rose, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | dress | frills | looking_at_viewer | solo | white_gloves | bare_shoulders | blue_rose | closed_mouth | feather_hair | green_eyes | large_breasts | wings | yellow_eyes | character_name | dated | feathers | medium_breasts | off_shoulder | simple_background | sitting | skirt | blue_flower | grey_eyes | hair_bow | holding_sword | rose | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:---------|:--------------------|:-------|:---------------|:-----------------|:------------|:---------------|:---------------|:-------------|:----------------|:--------|:--------------|:-----------------|:--------|:-----------|:-----------------|:---------------|:--------------------|:----------|:--------|:--------------|:------------|:-----------|:----------------|:-------|:-------------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
andresaparicio/apa | ---
license: openrail
---
|
amitrajitbh1/communities_unproc | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: author
dtype: string
- name: body
dtype: string
- name: normalizedBody
dtype: string
- name: subreddit
dtype: string
- name: subreddit_id
dtype: string
- name: id
dtype: string
- name: content
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 2450236670.538612
num_examples: 497952
download_size: 1497430442
dataset_size: 2450236670.538612
---
# Dataset Card for "communities_unproc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pvduy/ultrafeedback-trans-jp-cleaned-v1 | ---
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 246493971
num_examples: 56263
- name: test
num_bytes: 5557339
num_examples: 1000
download_size: 122204740
dataset_size: 252051310
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
suhan81/test01 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5826
num_examples: 39
download_size: 2572
dataset_size: 5826
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LahiruLowe/niv2_filtered_2pertask | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
splits:
- name: train
num_bytes: 2918226
num_examples: 3112
download_size: 1628617
dataset_size: 2918226
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "niv2_filtered_2pertask"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
freshpearYoon/train_free_40 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604569160
num_examples: 10000
download_size: 1229689634
dataset_size: 9604569160
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tdh87/MIxed40real60aiGenerated | ---
license: mit
---
|
Falah/chapter9_0_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2878
num_examples: 10
download_size: 4316
dataset_size: 2878
---
# Dataset Card for "chapter9_0_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
taylorbobaylor/tay-stack-v1 | ---
dataset_info:
features:
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 628428
num_examples: 66
download_size: 231587
dataset_size: 628428
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
wltjr1007/DomainNet | ---
language:
- en
license: other
size_categories:
- 100K<n<1M
task_categories:
- image-classification
- zero-shot-image-classification
task_ids:
- multi-class-image-classification
- multi-class-classification
pretty_name: DomainNet
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': aircraft_carrier
'1': airplane
'2': alarm_clock
'3': ambulance
'4': angel
'5': animal_migration
'6': ant
'7': anvil
'8': apple
'9': arm
'10': asparagus
'11': axe
'12': backpack
'13': banana
'14': bandage
'15': barn
'16': baseball
'17': baseball_bat
'18': basket
'19': basketball
'20': bat
'21': bathtub
'22': beach
'23': bear
'24': beard
'25': bed
'26': bee
'27': belt
'28': bench
'29': bicycle
'30': binoculars
'31': bird
'32': birthday_cake
'33': blackberry
'34': blueberry
'35': book
'36': boomerang
'37': bottlecap
'38': bowtie
'39': bracelet
'40': brain
'41': bread
'42': bridge
'43': broccoli
'44': broom
'45': bucket
'46': bulldozer
'47': bus
'48': bush
'49': butterfly
'50': cactus
'51': cake
'52': calculator
'53': calendar
'54': camel
'55': camera
'56': camouflage
'57': campfire
'58': candle
'59': cannon
'60': canoe
'61': car
'62': carrot
'63': castle
'64': cat
'65': ceiling_fan
'66': cello
'67': cell_phone
'68': chair
'69': chandelier
'70': church
'71': circle
'72': clarinet
'73': clock
'74': cloud
'75': coffee_cup
'76': compass
'77': computer
'78': cookie
'79': cooler
'80': couch
'81': cow
'82': crab
'83': crayon
'84': crocodile
'85': crown
'86': cruise_ship
'87': cup
'88': diamond
'89': dishwasher
'90': diving_board
'91': dog
'92': dolphin
'93': donut
'94': door
'95': dragon
'96': dresser
'97': drill
'98': drums
'99': duck
'100': dumbbell
'101': ear
'102': elbow
'103': elephant
'104': envelope
'105': eraser
'106': eye
'107': eyeglasses
'108': face
'109': fan
'110': feather
'111': fence
'112': finger
'113': fire_hydrant
'114': fireplace
'115': firetruck
'116': fish
'117': flamingo
'118': flashlight
'119': flip_flops
'120': floor_lamp
'121': flower
'122': flying_saucer
'123': foot
'124': fork
'125': frog
'126': frying_pan
'127': garden
'128': garden_hose
'129': giraffe
'130': goatee
'131': golf_club
'132': grapes
'133': grass
'134': guitar
'135': hamburger
'136': hammer
'137': hand
'138': harp
'139': hat
'140': headphones
'141': hedgehog
'142': helicopter
'143': helmet
'144': hexagon
'145': hockey_puck
'146': hockey_stick
'147': horse
'148': hospital
'149': hot_air_balloon
'150': hot_dog
'151': hot_tub
'152': hourglass
'153': house
'154': house_plant
'155': hurricane
'156': ice_cream
'157': jacket
'158': jail
'159': kangaroo
'160': key
'161': keyboard
'162': knee
'163': knife
'164': ladder
'165': lantern
'166': laptop
'167': leaf
'168': leg
'169': light_bulb
'170': lighter
'171': lighthouse
'172': lightning
'173': line
'174': lion
'175': lipstick
'176': lobster
'177': lollipop
'178': mailbox
'179': map
'180': marker
'181': matches
'182': megaphone
'183': mermaid
'184': microphone
'185': microwave
'186': monkey
'187': moon
'188': mosquito
'189': motorbike
'190': mountain
'191': mouse
'192': moustache
'193': mouth
'194': mug
'195': mushroom
'196': nail
'197': necklace
'198': nose
'199': ocean
'200': octagon
'201': octopus
'202': onion
'203': oven
'204': owl
'205': paintbrush
'206': paint_can
'207': palm_tree
'208': panda
'209': pants
'210': paper_clip
'211': parachute
'212': parrot
'213': passport
'214': peanut
'215': pear
'216': peas
'217': pencil
'218': penguin
'219': piano
'220': pickup_truck
'221': picture_frame
'222': pig
'223': pillow
'224': pineapple
'225': pizza
'226': pliers
'227': police_car
'228': pond
'229': pool
'230': popsicle
'231': postcard
'232': potato
'233': power_outlet
'234': purse
'235': rabbit
'236': raccoon
'237': radio
'238': rain
'239': rainbow
'240': rake
'241': remote_control
'242': rhinoceros
'243': rifle
'244': river
'245': roller_coaster
'246': rollerskates
'247': sailboat
'248': sandwich
'249': saw
'250': saxophone
'251': school_bus
'252': scissors
'253': scorpion
'254': screwdriver
'255': sea_turtle
'256': see_saw
'257': shark
'258': sheep
'259': shoe
'260': shorts
'261': shovel
'262': sink
'263': skateboard
'264': skull
'265': skyscraper
'266': sleeping_bag
'267': smiley_face
'268': snail
'269': snake
'270': snorkel
'271': snowflake
'272': snowman
'273': soccer_ball
'274': sock
'275': speedboat
'276': spider
'277': spoon
'278': spreadsheet
'279': square
'280': squiggle
'281': squirrel
'282': stairs
'283': star
'284': steak
'285': stereo
'286': stethoscope
'287': stitches
'288': stop_sign
'289': stove
'290': strawberry
'291': streetlight
'292': string_bean
'293': submarine
'294': suitcase
'295': sun
'296': swan
'297': sweater
'298': swing_set
'299': sword
'300': syringe
'301': table
'302': teapot
'303': teddy-bear
'304': telephone
'305': television
'306': tennis_racquet
'307': tent
'308': The_Eiffel_Tower
'309': The_Great_Wall_of_China
'310': The_Mona_Lisa
'311': tiger
'312': toaster
'313': toe
'314': toilet
'315': tooth
'316': toothbrush
'317': toothpaste
'318': tornado
'319': tractor
'320': traffic_light
'321': train
'322': tree
'323': triangle
'324': trombone
'325': truck
'326': trumpet
'327': t-shirt
'328': umbrella
'329': underwear
'330': van
'331': vase
'332': violin
'333': washing_machine
'334': watermelon
'335': waterslide
'336': whale
'337': wheel
'338': windmill
'339': wine_bottle
'340': wine_glass
'341': wristwatch
'342': yoga
'343': zebra
'344': zigzag
- name: domain
dtype:
class_label:
names:
'0': clipart
'1': infograph
'2': painting
'3': quickdraw
'4': real
'5': sketch
- name: image_path
dtype: string
splits:
- name: train
num_bytes: 1098474093.3600001
num_examples: 409832
- name: test
num_bytes: 471724034.589
num_examples: 176743
download_size: 18521436207
dataset_size: 1570198127.9490001
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
Data downloaded from [WILDS](https://wilds.stanford.edu/) ([Download](https://wilds.stanford.edu/downloads), [paper](https://arxiv.org/abs/1812.01754), [project](https://ai.bu.edu/M3SDA/)).
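A minimal loading sketch with the `datasets` library (streaming, to avoid the full ~18.5 GiB download up front; the repo id is assumed to be this card's):
```python
from datasets import load_dataset

# each example carries an image, a class label, a domain label
# (clipart / infograph / painting / quickdraw / real / sketch) and its path
ds = load_dataset("wltjr1007/DomainNet", split="train", streaming=True)
sample = next(iter(ds))
print(sample["label"], sample["domain"], sample["image_path"])
```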
This dataset contains some copyrighted material whose use has not been specifically authorized by the copyright owners. In an effort to advance scientific research, we make this material available for academic research. We believe this constitutes a fair use of any such copyrighted material as provided for in section 107 of the US Copyright Law. In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit for non-commercial research and educational purposes. For more information on fair use please click [here](https://www.law.cornell.edu/uscode/text/17/107). If you wish to use copyrighted material on this site or in our dataset for purposes of your own that go beyond non-commercial research and academic purposes, you must obtain permission directly from the copyright owner. (adapted from the [official DomainNet website](https://ai.bu.edu/M3SDA/#refs)) |
Back-up/chung-khoan-demo-p6 | ---
dataset_info:
features: []
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 324
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EgilKarlsen/BGL_RoBERTa_Baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115582709.0625
num_examples: 37500
- name: test
num_bytes: 38527570.0
num_examples: 12500
download_size: 211883223
dataset_size: 154110279.0625
---
# Dataset Card for "BGL_RoBERTa_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ZelaAI/lj_speech_2048 | ---
dataset_info:
features:
- name: text_tokens
sequence: int64
- name: audio_tokens_1
sequence: int64
- name: audio_tokens_2
sequence: int64
splits:
- name: train
num_bytes: 163157868
num_examples: 3331
download_size: 17054634
dataset_size: 163157868
---
# Dataset Card for "lj_speech_2048"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lucasroberto125/Lessa1 | ---
license: artistic-2.0
---
|
CyberHarem/izayoi_sakuya_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of izayoi_sakuya/いざよいさくや/十六夜咲夜/이자요이사쿠야 (Touhou)
This is the dataset of izayoi_sakuya/いざよいさくや/十六夜咲夜/이자요이사쿠야 (Touhou), containing 500 images and their tags.
The core tags of this character are `braid, twin_braids, maid_headdress, short_hair, grey_hair, bow, hair_bow, blue_eyes, bangs, breasts, ribbon, green_bow, hair_between_eyes, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 809.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/izayoi_sakuya_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 465.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/izayoi_sakuya_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1213 | 941.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/izayoi_sakuya_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 714.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/izayoi_sakuya_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1213 | 1.27 GiB | [Download](https://huggingface.co/datasets/CyberHarem/izayoi_sakuya_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/izayoi_sakuya_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
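The processed IMG+TXT packages listed in the table above can be fetched the same way; here is a minimal sketch for the 800px variant (the filename comes from the table above, the extraction directory is just an example):
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/izayoi_sakuya_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract the image/caption-text pairs to a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```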
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, maid, solo, waist_apron, blue_dress, knife, puffy_short_sleeves, holding, red_eyes, wrist_cuffs |
| 1 | 24 |  |  |  |  |  | 1girl, blue_dress, looking_at_viewer, maid_apron, puffy_short_sleeves, solo, waist_apron, white_apron, weapon, holding_knife, white_shirt, frilled_apron, red_eyes, closed_mouth, medium_breasts, wrist_cuffs, between_fingers, cowboy_shot, standing |
| 2 | 5 |  |  |  |  |  | 1girl, blue_dress, holding_knife, looking_at_viewer, puffy_short_sleeves, simple_background, solo, waist_apron, white_background, between_fingers, white_apron, black_pantyhose, frills, maid_apron, medium_breasts, closed_mouth, large_breasts, shoes, white_shirt, wrist_cuffs |
| 3 | 8 |  |  |  |  |  | 1girl, knife, maid, solo, apron, red_eyes, fingerless_gloves |
| 4 | 6 |  |  |  |  |  | 1girl, solo, lingerie, looking_at_viewer, navel, on_back, open_shirt, maid, thighhighs, white_bra, white_panties |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | maid | solo | waist_apron | blue_dress | knife | puffy_short_sleeves | holding | red_eyes | wrist_cuffs | maid_apron | white_apron | weapon | holding_knife | white_shirt | frilled_apron | closed_mouth | medium_breasts | between_fingers | cowboy_shot | standing | simple_background | white_background | black_pantyhose | frills | large_breasts | shoes | apron | fingerless_gloves | lingerie | navel | on_back | open_shirt | thighhighs | white_bra | white_panties |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------|:--------------|:-------------|:--------|:----------------------|:----------|:-----------|:--------------|:-------------|:--------------|:---------|:----------------|:--------------|:----------------|:---------------|:-----------------|:------------------|:--------------|:-----------|:--------------------|:-------------------|:------------------|:---------|:----------------|:--------|:--------|:--------------------|:-----------|:--------|:----------|:-------------|:-------------|:------------|:----------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 24 |  |  |  |  |  | X | X | | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | X | X | | X | | | X | X | X | | X | X | | X | X | X | | | X | X | X | X | X | X | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | X | X | | | X | | | X | | | | | | | | | | | | | | | | | | | X | X | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
Mohammad019/mib_resume_1.0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 9253747.770933978
num_examples: 1987
- name: test
num_bytes: 1154972.041867955
num_examples: 248
- name: validation
num_bytes: 1159629.1871980675
num_examples: 249
download_size: 5335612
dataset_size: 11568349.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_heegyu__LIMA-13b-hf | ---
pretty_name: Evaluation run of heegyu/LIMA-13b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [heegyu/LIMA-13b-hf](https://huggingface.co/heegyu/LIMA-13b-hf) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_heegyu__LIMA-13b-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T00:08:39.312434](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA-13b-hf/blob/main/results_2023-10-22T00-08-39.312434.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0020973154362416107,\n\
\ \"em_stderr\": 0.000468506503036833,\n \"f1\": 0.05783347315436248,\n\
\ \"f1_stderr\": 0.0013197558360646307,\n \"acc\": 0.4303028471618438,\n\
\ \"acc_stderr\": 0.009812237277361156\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.000468506503036833,\n\
\ \"f1\": 0.05783347315436248,\n \"f1_stderr\": 0.0013197558360646307\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0887035633055345,\n \
\ \"acc_stderr\": 0.00783145873705871\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.0117930158176636\n\
\ }\n}\n```"
repo_url: https://huggingface.co/heegyu/LIMA-13b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T00_08_39.312434
path:
- '**/details_harness|drop|3_2023-10-22T00-08-39.312434.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T00-08-39.312434.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T00_08_39.312434
path:
- '**/details_harness|gsm8k|5_2023-10-22T00-08-39.312434.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T00-08-39.312434.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T00_08_39.312434
path:
- '**/details_harness|winogrande|5_2023-10-22T00-08-39.312434.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T00-08-39.312434.parquet'
- config_name: results
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- results_2023-08-17T19:40:51.725558.parquet
- split: 2023_10_22T00_08_39.312434
path:
- results_2023-10-22T00-08-39.312434.parquet
- split: latest
path:
- results_2023-10-22T00-08-39.312434.parquet
---
# Dataset Card for Evaluation run of heegyu/LIMA-13b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/heegyu/LIMA-13b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [heegyu/LIMA-13b-hf](https://huggingface.co/heegyu/LIMA-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_heegyu__LIMA-13b-hf",
"harness_winogrande_5",
split="train")
```
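Each configuration also exposes one split per timestamped run plus a "latest" split (see the `data_files` listing above), so a specific run can be pinned instead of relying on the always-updated "train" split; a minimal sketch, assuming the split names from this card load as-is:
```python
from datasets import load_dataset
# pin the details to one specific run via its timestamped split name
data = load_dataset("open-llm-leaderboard/details_heegyu__LIMA-13b-hf",
	"harness_winogrande_5",
	split="2023_10_22T00_08_39.312434")
# the "latest" split always mirrors the most recent run
latest = load_dataset("open-llm-leaderboard/details_heegyu__LIMA-13b-hf",
	"harness_winogrande_5",
	split="latest")
```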
## Latest results
These are the [latest results from run 2023-10-22T00:08:39.312434](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA-13b-hf/blob/main/results_2023-10-22T00-08-39.312434.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0020973154362416107,
"em_stderr": 0.000468506503036833,
"f1": 0.05783347315436248,
"f1_stderr": 0.0013197558360646307,
"acc": 0.4303028471618438,
"acc_stderr": 0.009812237277361156
},
"harness|drop|3": {
"em": 0.0020973154362416107,
"em_stderr": 0.000468506503036833,
"f1": 0.05783347315436248,
"f1_stderr": 0.0013197558360646307
},
"harness|gsm8k|5": {
"acc": 0.0887035633055345,
"acc_stderr": 0.00783145873705871
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.0117930158176636
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__Llama-2-13B-fp16 | ---
pretty_name: Evaluation run of TheBloke/Llama-2-13B-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Llama-2-13B-fp16](https://huggingface.co/TheBloke/Llama-2-13B-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Llama-2-13B-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T22:53:07.629534](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-13B-fp16/blob/main/results_2023-10-22T22-53-07.629534.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.00039210421902982666,\n \"f1\": 0.0607822986577181,\n\
\ \"f1_stderr\": 0.0013583957676382913,\n \"acc\": 0.43739636770101,\n\
\ \"acc_stderr\": 0.010228023491905505\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.00039210421902982666,\n\
\ \"f1\": 0.0607822986577181,\n \"f1_stderr\": 0.0013583957676382913\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10841546626231995,\n \
\ \"acc_stderr\": 0.008563852506627487\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Llama-2-13B-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|arc:challenge|25_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T22_53_07.629534
path:
- '**/details_harness|drop|3_2023-10-22T22-53-07.629534.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T22-53-07.629534.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T22_53_07.629534
path:
- '**/details_harness|gsm8k|5_2023-10-22T22-53-07.629534.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T22-53-07.629534.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hellaswag|10_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T15:08:39.202746.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T15:08:39.202746.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T15:08:39.202746.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T22_53_07.629534
path:
- '**/details_harness|winogrande|5_2023-10-22T22-53-07.629534.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T22-53-07.629534.parquet'
- config_name: results
data_files:
- split: 2023_07_24T15_08_39.202746
path:
- results_2023-07-24T15:08:39.202746.parquet
- split: 2023_10_22T22_53_07.629534
path:
- results_2023-10-22T22-53-07.629534.parquet
- split: latest
path:
- results_2023-10-22T22-53-07.629534.parquet
---
# Dataset Card for Evaluation run of TheBloke/Llama-2-13B-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Llama-2-13B-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Llama-2-13B-fp16](https://huggingface.co/TheBloke/Llama-2-13B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-13B-fp16",
"harness_winogrande_5",
split="train")
```
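To inspect the aggregated metrics instead, you can load the "results" configuration; the "latest" split always resolves to the most recent run (a minimal sketch following the same API as above):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points at the most recent results file.
results = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-13B-fp16",
	"results",
	split="latest")
print(results[0])
```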
## Latest results
These are the [latest results from run 2023-10-22T22:53:07.629534](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-13B-fp16/blob/main/results_2023-10-22T22-53-07.629534.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902982666,
"f1": 0.0607822986577181,
"f1_stderr": 0.0013583957676382913,
"acc": 0.43739636770101,
"acc_stderr": 0.010228023491905505
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902982666,
"f1": 0.0607822986577181,
"f1_stderr": 0.0013583957676382913
},
"harness|gsm8k|5": {
"acc": 0.10841546626231995,
"acc_stderr": 0.008563852506627487
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183524
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
modelloosrvcc/Announcer | ---
license: openrail
---
|
e2e_nlg | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text2text-generation
task_ids: []
paperswithcode_id: e2e
pretty_name: End-to-End NLG Challenge
tags:
- meaning-representation-to-text
dataset_info:
features:
- name: meaning_representation
dtype: string
- name: human_reference
dtype: string
splits:
- name: train
num_bytes: 9435824
num_examples: 42061
- name: validation
num_bytes: 1171723
num_examples: 4672
- name: test
num_bytes: 1320205
num_examples: 4693
download_size: 11812316
dataset_size: 11927752
---
# Dataset Card for End-to-End NLG Challenge
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [homepage](http://www.macs.hw.ac.uk/InteractionLab/E2E/)
- **Repository:** [repository](https://github.com/tuetschek/e2e-dataset/)
- **Paper:** [paper](https://arxiv.org/abs/1706.09254)
- **Leaderboard:** [leaderboard](http://www.macs.hw.ac.uk/InteractionLab/E2E/)
### Dataset Summary
The E2E dataset is used for training end-to-end, data-driven natural language generation systems in the restaurant domain; it is ten times bigger than existing, frequently used datasets in this area.
The E2E dataset poses new challenges:
(1) its human reference texts show more lexical richness and syntactic variation, including discourse phenomena;
(2) generating from this set requires content selection. As such, learning from this dataset promises more natural, varied and less template-like system utterances.
E2E is released in the following paper where you can find more details and baseline results:
https://arxiv.org/abs/1706.09254
### Supported Tasks and Leaderboards
- `text2text-generation-other-meaning-representation-to-text`: The dataset can be used to train a model to generate descriptions in the restaurant domain from meaning representations: the model takes as input some data about a restaurant and generates a natural-language sentence presenting the different aspects of that data. Success on this task is typically measured by achieving a *high* [BLEU](https://huggingface.co/metrics/bleu), [NIST](https://huggingface.co/metrics/nist), [METEOR](https://huggingface.co/metrics/meteor), [Rouge-L](https://huggingface.co/metrics/rouge), or [CIDEr](https://huggingface.co/metrics/cider) score. The TGen model (Dušek and Jurčíček, 2016a), used as a baseline, had the following scores:
| | BLEU | NIST | METEOR | ROUGE_L | CIDEr |
| -------- | ------ | ------ | ------ | ------- | ------ |
| BASELINE | 0.6593 | 8.6094 | 0.4483 | 0.6850 | 2.2338 |
This task has an inactive leaderboard which can be found [here](http://www.macs.hw.ac.uk/InteractionLab/E2E/) and ranks models based on the metrics above.
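For instance, BLEU against the human references can be computed with the `evaluate` library (a minimal sketch; the prediction string is invented for illustration):
```python
import evaluate  # pip install evaluate

bleu = evaluate.load("bleu")
predictions = ["The Vaults is a pub near Café Adriatic with a 5 out of 5 rating."]
references = [["The Vaults pub near Café Adriatic has a 5 star rating. Prices start at £30."]]
print(bleu.compute(predictions=predictions, references=references)["bleu"])
```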
### Languages
The dataset is in English (en).
## Dataset Structure
### Data Instances
Example of one instance:
```
{'human_reference': 'The Vaults pub near Café Adriatic has a 5 star rating. Prices start at £30.',
'meaning_representation': 'name[The Vaults], eatType[pub], priceRange[more than £30], customer rating[5 out of 5], near[Café Adriatic]'}
```
### Data Fields
- `human_reference`: string, a natural-language text that describes the different characteristics given in the meaning representation
- `meaning_representation`: string listing the slots and values to generate a description from (e.g. `name[The Vaults], eatType[pub]`)
Each MR consists of 3–8 attributes (slots), such as name, food or area, and their values.
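Since each MR is stored as a single `slot[value]` string, it is often convenient to parse it into a dictionary first; a minimal sketch (the `parse_mr` helper is ours, not part of the dataset):
```python
import re

def parse_mr(mr: str) -> dict:
    """Parse 'name[The Vaults], eatType[pub]' into {'name': 'The Vaults', 'eatType': 'pub'}."""
    return {slot.strip(): value for slot, value in re.findall(r"([^,\[\]]+)\[(.*?)\]", mr)}

mr = "name[The Vaults], eatType[pub], priceRange[more than £30], customer rating[5 out of 5], near[Café Adriatic]"
print(parse_mr(mr))
```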
### Data Splits
The dataset is split into training, validation and testing sets (in a 76.5-8.5-15 ratio), keeping a similar distribution of MR and reference text lengths and ensuring that MRs in different sets are distinct.
| | train | validation | test |
| ----- |-------:|------------:|------:|
| N. Instances | 42061 | 4672 | 4693 |
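These counts can be verified directly after loading the dataset (a minimal sketch):
```python
from datasets import load_dataset

e2e = load_dataset("e2e_nlg")
print({split: ds.num_rows for split, ds in e2e.items()})
# {'train': 42061, 'validation': 4672, 'test': 4693}
```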
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
The data was collected using the CrowdFlower platform and quality-controlled following Novikova et al. (2016).
#### Who are the source language producers?
[More Information Needed]
### Annotations
Following Novikova et al. (2016), the E2E data was collected using pictures as stimuli, an approach shown to elicit significantly more natural, more informative, and better-phrased human references than textual MRs.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@article{dusek.etal2020:csl,
title = {Evaluating the {{State}}-of-the-{{Art}} of {{End}}-to-{{End Natural Language Generation}}: {{The E2E NLG Challenge}}},
author = {Du{\v{s}}ek, Ond\v{r}ej and Novikova, Jekaterina and Rieser, Verena},
year = {2020},
month = jan,
volume = {59},
pages = {123--156},
doi = {10.1016/j.csl.2019.06.009},
archivePrefix = {arXiv},
eprint = {1901.11528},
eprinttype = {arxiv},
journal = {Computer Speech \& Language}
}
```
### Contributions
Thanks to [@lhoestq](https://github.com/lhoestq) for adding this dataset. |
Tngarg/chinese_test | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: tweet
dtype: string
- name: sentiment
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1574339
num_examples: 11513
download_size: 1143927
dataset_size: 1574339
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "chinese_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vwxyzjn/ultrachat_200k_filtered_1707919621 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_reference_response
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_reference_response_token
sequence: int64
- name: query_reference_response_token_len
dtype: int64
- name: query
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_token
sequence: int64
- name: query_token_len
dtype: int64
- name: reference_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: reference_response_token
sequence: int64
- name: reference_response_token_len
dtype: int64
splits:
- name: test_gen
num_bytes: 30484069
num_examples: 1000
- name: test_sft
num_bytes: 39592502
num_examples: 1000
- name: train_gen
num_bytes: 29613744
num_examples: 1000
- name: train_sft
num_bytes: 39521233
num_examples: 1000
download_size: 50859072
dataset_size: 139211548
---
# Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': True,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=3000,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_sft_response_length=1500,
max_sft_query_response_length=4500,
max_rm_response_length=169,
max_rm_query_response_length=638),
'push_to_hub': True}
```
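As a rough illustration of the `TaskQueryHParams` above (our own sketch, not the original preprocessing code), a query is built from `format_str`, truncated on the `post` field at newline boundaries, and left-padded with the pad token to the fixed query length:
```python
# Sketch of the padding/truncation described by TaskQueryHParams; not the actual pipeline.
FORMAT_STR = "SUBREDDIT: r/{subreddit}\n\nTITLE: {title}\n\nPOST: {post}\n\nTL;DR:"
LENGTH, PAD_TOKEN = 3000, 32000  # length and pad_token from the params above

def build_query(tokenizer, subreddit, title, post):
    ids = tokenizer.encode(FORMAT_STR.format(subreddit=subreddit, title=title, post=post))
    while len(ids) > LENGTH and "\n" in post:
        # truncate_field='post', truncate_text='\n': drop the last line of the post.
        post = post.rsplit("\n", 1)[0]
        ids = tokenizer.encode(FORMAT_STR.format(subreddit=subreddit, title=title, post=post))
    # pad_side='left': prepend pad tokens up to the fixed query length.
    return [PAD_TOKEN] * (LENGTH - len(ids)) + ids
```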
|
freshpearYoon/train_free_2 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604855104
num_examples: 10000
download_size: 1387232122
dataset_size: 9604855104
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gguichard/wsd_myriade_synth_data_gpt4turbo_val_v2 | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: wn_sens
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 466852
num_examples: 676
download_size: 111646
dataset_size: 466852
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wsd_myriade_synth_data_gpt4turbo_val_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/deep_learning_books_dataset | ---
dataset_info:
features:
- name: page_no
dtype: int64
- name: page_content
dtype: string
splits:
- name: train
num_bytes: 1030431
num_examples: 474
download_size: 509839
dataset_size: 1030431
---
# Deep Learning Books Dataset
## Dataset Information
- **Features**:
- `page_no`: Integer (int64) - Page number in the book.
- `page_content`: String - Text content of the page.
- **Splits**:
- `train`: Training split.
- Number of examples: 474
- Number of bytes: 1,030,431
- **Download Size**: 509,839 bytes
- **Dataset Size**: 1,030,431 bytes
## Dataset Application
This dataset "deep_learning_books_dataset" contains text data from various pages of books related to deep learning.
It can be used for various natural language processing (NLP) tasks such as text classification,
language modeling, text generation, and more.
### Using Python and Hugging Face's Transformers Library
To use this dataset for NLP text generation and language modeling tasks, you can follow these steps:
1. Install the required library:
```
pip install datasets
```
2. Load the dataset:
```python
from datasets import load_dataset

dataset = load_dataset("Falah/deep_learning_books_dataset")
```
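Each example exposes the `page_no` and `page_content` fields described above; for instance, to concatenate all pages into a single corpus for language modeling (a minimal sketch building on the snippet above):
```python
pages = dataset["train"]  # `dataset` from the loading step above
print(pages[0]["page_no"], pages[0]["page_content"][:80])

corpus = "\n".join(row["page_content"] for row in pages)
print(f"corpus size: {len(corpus)} characters")
```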
## Citation
Please use the following citation when referencing this dataset:
```
@dataset{deep_learning_books_dataset,
author = {Falah.G.Salieh},
title = {Deep Learning Books Dataset},
year = {2023},
publisher = {HuggingFace Hub},
version = {1.0},
location = {Online},
url = {https://huggingface.co/datasets/Falah/deep_learning_books_dataset}
}
```
### Apache License:
The "{Deep Learning Books Dataset" is distributed under the Apache License 2.0.
The specific licensing and usage terms for this dataset can be found in the dataset repository or documentation.
Please make sure to review and comply with the applicable license and usage terms before downloading and using the dataset.
|
fia24/augmented_bangla_money10k_80_20 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '1'
'1': '10'
'2': '100'
'3': '1000'
'4': '2'
'5': '20'
'6': '200'
'7': '5'
'8': '50'
'9': '500'
splits:
- name: train
num_bytes: 75522474.4
num_examples: 8000
- name: test
num_bytes: 18872730.6
num_examples: 2000
download_size: 88787135
dataset_size: 94395205.0
---
# Dataset Card for "augmented_bangla_money10k_80_20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_macadeliccc__Mistral-7B-v0.2-OpenHermes | ---
pretty_name: Evaluation run of macadeliccc/Mistral-7B-v0.2-OpenHermes
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [macadeliccc/Mistral-7B-v0.2-OpenHermes](https://huggingface.co/macadeliccc/Mistral-7B-v0.2-OpenHermes)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__Mistral-7B-v0.2-OpenHermes\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T18:19:33.883845](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__Mistral-7B-v0.2-OpenHermes/blob/main/results_2024-03-27T18-19-33.883845.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5919484433220782,\n\
\ \"acc_stderr\": 0.03315093682640097,\n \"acc_norm\": 0.6029228152554236,\n\
\ \"acc_norm_stderr\": 0.03406103831759794,\n \"mc1\": 0.2864137086903305,\n\
\ \"mc1_stderr\": 0.01582614243950236,\n \"mc2\": 0.4308666648887643,\n\
\ \"mc2_stderr\": 0.0144918724444321\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.515358361774744,\n \"acc_stderr\": 0.014604496129394908,\n\
\ \"acc_norm\": 0.5580204778156996,\n \"acc_norm_stderr\": 0.01451268252312834\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6082453694483171,\n\
\ \"acc_stderr\": 0.004871447106554927,\n \"acc_norm\": 0.816072495518821,\n\
\ \"acc_norm_stderr\": 0.0038663327313633263\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.037507570448955356,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.037507570448955356\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851095,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851095\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n\
\ \"acc_stderr\": 0.025906087021319295,\n \"acc_norm\": 0.7064516129032258,\n\
\ \"acc_norm_stderr\": 0.025906087021319295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486519,\n\
\ \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486519\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124488,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124488\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624528,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.541025641025641,\n \"acc_stderr\": 0.025265525491284295,\n \
\ \"acc_norm\": 0.541025641025641,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.02794045713622841,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.02794045713622841\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.017149858514250948,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.017149858514250948\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.033953227263757976,\n\
\ \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695063,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695063\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835795,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835795\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161551,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161551\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7713920817369093,\n\
\ \"acc_stderr\": 0.015016884698539873,\n \"acc_norm\": 0.7713920817369093,\n\
\ \"acc_norm_stderr\": 0.015016884698539873\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3564245810055866,\n\
\ \"acc_stderr\": 0.016018239710513398,\n \"acc_norm\": 0.3564245810055866,\n\
\ \"acc_norm_stderr\": 0.016018239710513398\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882116,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882116\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144363,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144363\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4361147327249022,\n\
\ \"acc_stderr\": 0.012665568135455333,\n \"acc_norm\": 0.4361147327249022,\n\
\ \"acc_norm_stderr\": 0.012665568135455333\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928006,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928006\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6241830065359477,\n \"acc_stderr\": 0.01959402113657744,\n \
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.01959402113657744\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n\
\ \"mc1_stderr\": 0.01582614243950236,\n \"mc2\": 0.4308666648887643,\n\
\ \"mc2_stderr\": 0.0144918724444321\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722757\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/macadeliccc/Mistral-7B-v0.2-OpenHermes
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|arc:challenge|25_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|gsm8k|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hellaswag|10_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-19-33.883845.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T18-19-33.883845.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- '**/details_harness|winogrande|5_2024-03-27T18-19-33.883845.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T18-19-33.883845.parquet'
- config_name: results
data_files:
- split: 2024_03_27T18_19_33.883845
path:
- results_2024-03-27T18-19-33.883845.parquet
- split: latest
path:
- results_2024-03-27T18-19-33.883845.parquet
---
# Dataset Card for Evaluation run of macadeliccc/Mistral-7B-v0.2-OpenHermes
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/Mistral-7B-v0.2-OpenHermes](https://huggingface.co/macadeliccc/Mistral-7B-v0.2-OpenHermes) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__Mistral-7B-v0.2-OpenHermes",
"harness_winogrande_5",
split="train")
```
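The aggregated scores can be loaded the same way through the "results" configuration. A minimal sketch, using only the config and split names declared in the `configs` section above:
```python
from datasets import load_dataset

# Load the aggregated metrics; the "latest" split always points to the
# most recent evaluation run listed in this card's configs.
results = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__Mistral-7B-v0.2-OpenHermes",
    "results",
    split="latest",
)
print(results[0])
```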
## Latest results
These are the [latest results from run 2024-03-27T18:19:33.883845](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__Mistral-7B-v0.2-OpenHermes/blob/main/results_2024-03-27T18-19-33.883845.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5919484433220782,
"acc_stderr": 0.03315093682640097,
"acc_norm": 0.6029228152554236,
"acc_norm_stderr": 0.03406103831759794,
"mc1": 0.2864137086903305,
"mc1_stderr": 0.01582614243950236,
"mc2": 0.4308666648887643,
"mc2_stderr": 0.0144918724444321
},
"harness|arc:challenge|25": {
"acc": 0.515358361774744,
"acc_stderr": 0.014604496129394908,
"acc_norm": 0.5580204778156996,
"acc_norm_stderr": 0.01451268252312834
},
"harness|hellaswag|10": {
"acc": 0.6082453694483171,
"acc_stderr": 0.004871447106554927,
"acc_norm": 0.816072495518821,
"acc_norm_stderr": 0.0038663327313633263
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.037507570448955356,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.037507570448955356
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851095,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851095
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.025906087021319295,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.025906087021319295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486519,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486519
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124488,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624528,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.541025641025641,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.541025641025641,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.02794045713622841,
"acc_norm": 0.3,
"acc_norm_stderr": 0.02794045713622841
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.017149858514250948,
"acc_norm": 0.8,
"acc_norm_stderr": 0.017149858514250948
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695063,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695063
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835795,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835795
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161551,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161551
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7713920817369093,
"acc_stderr": 0.015016884698539873,
"acc_norm": 0.7713920817369093,
"acc_norm_stderr": 0.015016884698539873
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3564245810055866,
"acc_stderr": 0.016018239710513398,
"acc_norm": 0.3564245810055866,
"acc_norm_stderr": 0.016018239710513398
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02622964917882116,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02622964917882116
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144363,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144363
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4361147327249022,
"acc_stderr": 0.012665568135455333,
"acc_norm": 0.4361147327249022,
"acc_norm_stderr": 0.012665568135455333
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928006,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.01959402113657744,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.01959402113657744
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2864137086903305,
"mc1_stderr": 0.01582614243950236,
"mc2": 0.4308666648887643,
"mc2_stderr": 0.0144918724444321
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.011508957690722757
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
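As a worked example, the per-subtask MMLU (hendrycksTest) scores above can be re-aggregated by hand. A minimal sketch, assuming the linked `results_*.json` file keeps these metrics under a top-level "results" key (it falls back to the raw dict otherwise):
```python
import json
from huggingface_hub import hf_hub_download

# Fetch the raw results file linked in the "Latest results" section above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_macadeliccc__Mistral-7B-v0.2-OpenHermes",
    filename="results_2024-03-27T18-19-33.883845.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Assumption: metrics may sit under a "results" key; otherwise use the dict as-is.
scores = data.get("results", data)

# Average acc_norm across the 57 hendrycksTest (MMLU) subtasks.
mmlu = [v["acc_norm"] for k, v in scores.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU average acc_norm over {len(mmlu)} subtasks: {sum(mmlu) / len(mmlu):.4f}")
```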
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_ShinojiResearch__Senku-70B-Full | ---
pretty_name: Evaluation run of ShinojiResearch/Senku-70B-Full
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ShinojiResearch/Senku-70B-Full](https://huggingface.co/ShinojiResearch/Senku-70B-Full)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ShinojiResearch__Senku-70B-Full\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T22:09:19.492878](https://huggingface.co/datasets/open-llm-leaderboard/details_ShinojiResearch__Senku-70B-Full/blob/main/results_2024-02-09T22-09-19.492878.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7505923110347043,\n\
\ \"acc_stderr\": 0.02868102140930387,\n \"acc_norm\": 0.7535032633378316,\n\
\ \"acc_norm_stderr\": 0.029238591782710294,\n \"mc1\": 0.4541003671970624,\n\
\ \"mc1_stderr\": 0.017429593091323522,\n \"mc2\": 0.619572860600058,\n\
\ \"mc2_stderr\": 0.014905285944975092\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6680887372013652,\n \"acc_stderr\": 0.013760988200880534,\n\
\ \"acc_norm\": 0.7150170648464164,\n \"acc_norm_stderr\": 0.013191348179838793\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6940848436566421,\n\
\ \"acc_stderr\": 0.004598522271041222,\n \"acc_norm\": 0.8788090021907986,\n\
\ \"acc_norm_stderr\": 0.003256821418857317\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n\
\ \"acc_stderr\": 0.03999262876617722,\n \"acc_norm\": 0.6888888888888889,\n\
\ \"acc_norm_stderr\": 0.03999262876617722\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8355263157894737,\n \"acc_stderr\": 0.030167533468632726,\n\
\ \"acc_norm\": 0.8355263157894737,\n \"acc_norm_stderr\": 0.030167533468632726\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7962264150943397,\n \"acc_stderr\": 0.024790784501775406,\n\
\ \"acc_norm\": 0.7962264150943397,\n \"acc_norm_stderr\": 0.024790784501775406\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.02628055093284808,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.02628055093284808\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.04971358884367406,\n\
\ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.04971358884367406\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7404255319148936,\n \"acc_stderr\": 0.02865917937429232,\n\
\ \"acc_norm\": 0.7404255319148936,\n \"acc_norm_stderr\": 0.02865917937429232\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n\
\ \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n\
\ \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.696551724137931,\n \"acc_stderr\": 0.038312260488503336,\n\
\ \"acc_norm\": 0.696551724137931,\n \"acc_norm_stderr\": 0.038312260488503336\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5423280423280423,\n \"acc_stderr\": 0.025658868862058322,\n \"\
acc_norm\": 0.5423280423280423,\n \"acc_norm_stderr\": 0.025658868862058322\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8838709677419355,\n \"acc_stderr\": 0.018225757949432302,\n \"\
acc_norm\": 0.8838709677419355,\n \"acc_norm_stderr\": 0.018225757949432302\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6206896551724138,\n \"acc_stderr\": 0.034139638059062345,\n \"\
acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.034139638059062345\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\"\
: 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066584,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066584\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9040404040404041,\n \"acc_stderr\": 0.020984808610047926,\n \"\
acc_norm\": 0.9040404040404041,\n \"acc_norm_stderr\": 0.020984808610047926\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607558,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607558\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7794871794871795,\n \"acc_stderr\": 0.0210206726808279,\n \
\ \"acc_norm\": 0.7794871794871795,\n \"acc_norm_stderr\": 0.0210206726808279\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4148148148148148,\n \"acc_stderr\": 0.03003984245406929,\n \
\ \"acc_norm\": 0.4148148148148148,\n \"acc_norm_stderr\": 0.03003984245406929\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.023005459446673936,\n\
\ \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.023005459446673936\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5165562913907285,\n \"acc_stderr\": 0.04080244185628972,\n \"\
acc_norm\": 0.5165562913907285,\n \"acc_norm_stderr\": 0.04080244185628972\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9302752293577982,\n \"acc_stderr\": 0.010919426411848614,\n \"\
acc_norm\": 0.9302752293577982,\n \"acc_norm_stderr\": 0.010919426411848614\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6620370370370371,\n \"acc_stderr\": 0.03225941352631295,\n \"\
acc_norm\": 0.6620370370370371,\n \"acc_norm_stderr\": 0.03225941352631295\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316942,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316942\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065505,\n \
\ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065505\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8161434977578476,\n\
\ \"acc_stderr\": 0.025998379092356513,\n \"acc_norm\": 0.8161434977578476,\n\
\ \"acc_norm_stderr\": 0.025998379092356513\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.02871877688934232,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.02871877688934232\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9173553719008265,\n \"acc_stderr\": 0.025135382356604227,\n \"\
acc_norm\": 0.9173553719008265,\n \"acc_norm_stderr\": 0.025135382356604227\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8703703703703703,\n\
\ \"acc_stderr\": 0.03247224389917948,\n \"acc_norm\": 0.8703703703703703,\n\
\ \"acc_norm_stderr\": 0.03247224389917948\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.02963471727237103,\n\
\ \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.02963471727237103\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6607142857142857,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.6607142857142857,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808629,\n\
\ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808629\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n\
\ \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n\
\ \"acc_norm_stderr\": 0.016046261631673137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8991060025542784,\n\
\ \"acc_stderr\": 0.010770472014886715,\n \"acc_norm\": 0.8991060025542784,\n\
\ \"acc_norm_stderr\": 0.010770472014886715\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n\
\ \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6815642458100558,\n\
\ \"acc_stderr\": 0.015581008080360274,\n \"acc_norm\": 0.6815642458100558,\n\
\ \"acc_norm_stderr\": 0.015581008080360274\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.021339479988816027,\n\
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.021339479988816027\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n\
\ \"acc_stderr\": 0.021670058885510782,\n \"acc_norm\": 0.8231511254019293,\n\
\ \"acc_norm_stderr\": 0.021670058885510782\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.01887735383957185,\n\
\ \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.01887735383957185\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5815602836879432,\n \"acc_stderr\": 0.029427994039419998,\n \
\ \"acc_norm\": 0.5815602836879432,\n \"acc_norm_stderr\": 0.029427994039419998\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5821382007822686,\n\
\ \"acc_stderr\": 0.012596744108998569,\n \"acc_norm\": 0.5821382007822686,\n\
\ \"acc_norm_stderr\": 0.012596744108998569\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \
\ \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8202614379084967,\n \"acc_stderr\": 0.01553374508338279,\n \
\ \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.01553374508338279\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.023661699177098608,\n\
\ \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.023661699177098608\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9104477611940298,\n\
\ \"acc_stderr\": 0.02019067053502791,\n \"acc_norm\": 0.9104477611940298,\n\
\ \"acc_norm_stderr\": 0.02019067053502791\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.94,\n \"acc_stderr\": 0.02386832565759415,\n \
\ \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.02386832565759415\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n\
\ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4541003671970624,\n\
\ \"mc1_stderr\": 0.017429593091323522,\n \"mc2\": 0.619572860600058,\n\
\ \"mc2_stderr\": 0.014905285944975092\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.010099208246065583\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7134192570128886,\n \
\ \"acc_stderr\": 0.012454841668337688\n }\n}\n```"
repo_url: https://huggingface.co/ShinojiResearch/Senku-70B-Full
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|arc:challenge|25_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|arc:challenge|25_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|gsm8k|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|gsm8k|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hellaswag|10_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hellaswag|10_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-53-37.284416.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T22-09-19.492878.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T22-09-19.492878.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- '**/details_harness|winogrande|5_2024-02-09T21-53-37.284416.parquet'
- split: 2024_02_09T22_09_19.492878
path:
- '**/details_harness|winogrande|5_2024-02-09T22-09-19.492878.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T22-09-19.492878.parquet'
- config_name: results
data_files:
- split: 2024_02_09T21_53_37.284416
path:
- results_2024-02-09T21-53-37.284416.parquet
- split: 2024_02_09T22_09_19.492878
path:
- results_2024-02-09T22-09-19.492878.parquet
- split: latest
path:
- results_2024-02-09T22-09-19.492878.parquet
---
# Dataset Card for Evaluation run of ShinojiResearch/Senku-70B-Full
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ShinojiResearch/Senku-70B-Full](https://huggingface.co/ShinojiResearch/Senku-70B-Full) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ShinojiResearch__Senku-70B-Full",
"harness_winogrande_5",
split="train")
```
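You can also enumerate the available configurations, or load the aggregated "results" configuration directly. A minimal sketch, assuming the repository and config names from the listing above (`get_dataset_config_names` is a standard `datasets` utility):
```python
from datasets import get_dataset_config_names, load_dataset

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_ShinojiResearch__Senku-70B-Full"
)
print(len(configs))

# The "latest" split of the "results" config points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_ShinojiResearch__Senku-70B-Full",
    "results",
    split="latest",
)
```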
## Latest results
These are the [latest results from run 2024-02-09T22:09:19.492878](https://huggingface.co/datasets/open-llm-leaderboard/details_ShinojiResearch__Senku-70B-Full/blob/main/results_2024-02-09T22-09-19.492878.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.7505923110347043,
"acc_stderr": 0.02868102140930387,
"acc_norm": 0.7535032633378316,
"acc_norm_stderr": 0.029238591782710294,
"mc1": 0.4541003671970624,
"mc1_stderr": 0.017429593091323522,
"mc2": 0.619572860600058,
"mc2_stderr": 0.014905285944975092
},
"harness|arc:challenge|25": {
"acc": 0.6680887372013652,
"acc_stderr": 0.013760988200880534,
"acc_norm": 0.7150170648464164,
"acc_norm_stderr": 0.013191348179838793
},
"harness|hellaswag|10": {
"acc": 0.6940848436566421,
"acc_stderr": 0.004598522271041222,
"acc_norm": 0.8788090021907986,
"acc_norm_stderr": 0.003256821418857317
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.03999262876617722,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.03999262876617722
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8355263157894737,
"acc_stderr": 0.030167533468632726,
"acc_norm": 0.8355263157894737,
"acc_norm_stderr": 0.030167533468632726
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7962264150943397,
"acc_stderr": 0.024790784501775406,
"acc_norm": 0.7962264150943397,
"acc_norm_stderr": 0.024790784501775406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02628055093284808,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02628055093284808
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.04971358884367406,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.04971358884367406
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7404255319148936,
"acc_stderr": 0.02865917937429232,
"acc_norm": 0.7404255319148936,
"acc_norm_stderr": 0.02865917937429232
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.696551724137931,
"acc_stderr": 0.038312260488503336,
"acc_norm": 0.696551724137931,
"acc_norm_stderr": 0.038312260488503336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5423280423280423,
"acc_stderr": 0.025658868862058322,
"acc_norm": 0.5423280423280423,
"acc_norm_stderr": 0.025658868862058322
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432302,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432302
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066584,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066584
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9040404040404041,
"acc_stderr": 0.020984808610047926,
"acc_norm": 0.9040404040404041,
"acc_norm_stderr": 0.020984808610047926
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607558,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607558
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7794871794871795,
"acc_stderr": 0.0210206726808279,
"acc_norm": 0.7794871794871795,
"acc_norm_stderr": 0.0210206726808279
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.03003984245406929,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.03003984245406929
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.023005459446673936,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.023005459446673936
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5165562913907285,
"acc_stderr": 0.04080244185628972,
"acc_norm": 0.5165562913907285,
"acc_norm_stderr": 0.04080244185628972
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9302752293577982,
"acc_stderr": 0.010919426411848614,
"acc_norm": 0.9302752293577982,
"acc_norm_stderr": 0.010919426411848614
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6620370370370371,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.6620370370370371,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316942,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316942
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065505,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065505
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8161434977578476,
"acc_stderr": 0.025998379092356513,
"acc_norm": 0.8161434977578476,
"acc_norm_stderr": 0.025998379092356513
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.02871877688934232,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.02871877688934232
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9173553719008265,
"acc_stderr": 0.025135382356604227,
"acc_norm": 0.9173553719008265,
"acc_norm_stderr": 0.025135382356604227
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.03247224389917948,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.03247224389917948
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.02963471727237103,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.02963471727237103
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6607142857142857,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.6607142857142857,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808629,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808629
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673137,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8991060025542784,
"acc_stderr": 0.010770472014886715,
"acc_norm": 0.8991060025542784,
"acc_norm_stderr": 0.010770472014886715
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.020647590029679332,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.020647590029679332
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6815642458100558,
"acc_stderr": 0.015581008080360274,
"acc_norm": 0.6815642458100558,
"acc_norm_stderr": 0.015581008080360274
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.021339479988816027,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.021339479988816027
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8231511254019293,
"acc_stderr": 0.021670058885510782,
"acc_norm": 0.8231511254019293,
"acc_norm_stderr": 0.021670058885510782
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.01887735383957185,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.01887735383957185
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5815602836879432,
"acc_stderr": 0.029427994039419998,
"acc_norm": 0.5815602836879432,
"acc_norm_stderr": 0.029427994039419998
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5821382007822686,
"acc_stderr": 0.012596744108998569,
"acc_norm": 0.5821382007822686,
"acc_norm_stderr": 0.012596744108998569
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.01553374508338279,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.01553374508338279
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8367346938775511,
"acc_stderr": 0.023661699177098608,
"acc_norm": 0.8367346938775511,
"acc_norm_stderr": 0.023661699177098608
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9104477611940298,
"acc_stderr": 0.02019067053502791,
"acc_norm": 0.9104477611940298,
"acc_norm_stderr": 0.02019067053502791
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.02386832565759415,
"acc_norm": 0.94,
"acc_norm_stderr": 0.02386832565759415
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4541003671970624,
"mc1_stderr": 0.017429593091323522,
"mc2": 0.619572860600058,
"mc2_stderr": 0.014905285944975092
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.010099208246065583
},
"harness|gsm8k|5": {
"acc": 0.7134192570128886,
"acc_stderr": 0.012454841668337688
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jeremygf/domains-app-alpha | ---
dataset_info:
features:
- name: text
dtype: string
- name: length
dtype: int64
- name: ids
sequence: int64
splits:
- name: train
num_bytes: 98587954
num_examples: 534152
download_size: 9814171
dataset_size: 98587954
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/closure_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Closure/クロージャ/可露希尔 (Arknights)
This is the dataset of Closure/クロージャ/可露希尔 (Arknights), containing 132 images and their tags.
The core tags of this character are `black_hair, long_hair, pointy_ears, red_eyes, hair_between_eyes, breasts, two_side_up`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 132 | 225.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/closure_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 132 | 187.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/closure_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 327 | 374.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/closure_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/closure_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
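The IMG+TXT packages listed in the table above do not need waifuc: they are plain zip archives in which each image ships with a tag file. Below is a minimal sketch for the 1200px package; it assumes the common convention that every image has a same-stem `.txt` file of comma-separated tags, so check the extracted layout before relying on it.
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 1200px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/closure_arknights',
    repo_type='dataset',
    filename='dataset-1200.zip',
)

# extract files to your directory
dataset_dir = 'dataset_1200'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its same-stem .txt tag file
# (assumed convention for the IMG+TXT packages; adjust if the layout differs)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    txt_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(txt_path):
        with open(txt_path, 'r', encoding='utf-8') as f:
            tags = f.read().strip()
        print(name, '->', tags)
```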
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 27 |  |  |  |  |  | 1girl, black_jacket, white_shirt, open_jacket, solo, looking_at_viewer, black_choker, smile, long_sleeves, simple_background, collarbone, white_background, open_mouth, upper_body, blush, fang, belt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_jacket | white_shirt | open_jacket | solo | looking_at_viewer | black_choker | smile | long_sleeves | simple_background | collarbone | white_background | open_mouth | upper_body | blush | fang | belt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------|:--------------|:-------|:--------------------|:---------------|:--------|:---------------|:--------------------|:-------------|:-------------------|:-------------|:-------------|:--------|:-------|:-------|
| 0 | 27 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_Sao10K__Venomia-m7 | ---
pretty_name: Evaluation run of Sao10K/Venomia-m7
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Venomia-m7](https://huggingface.co/Sao10K/Venomia-m7) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Venomia-m7\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T19:38:57.975905](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Venomia-m7/blob/main/results_2023-12-09T19-38-57.975905.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5994561896653553,\n\
\ \"acc_stderr\": 0.032914574924459074,\n \"acc_norm\": 0.6051903322222639,\n\
\ \"acc_norm_stderr\": 0.03358493802168653,\n \"mc1\": 0.34149326805385555,\n\
\ \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.49078721070216347,\n\
\ \"mc2_stderr\": 0.015495976475887885\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\
\ \"acc_norm\": 0.6313993174061433,\n \"acc_norm_stderr\": 0.014097810678042194\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6635132443736308,\n\
\ \"acc_stderr\": 0.004715419139697519,\n \"acc_norm\": 0.8399721171081458,\n\
\ \"acc_norm_stderr\": 0.0036588262081016106\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6415094339622641,\n \"acc_stderr\": 0.02951470358398177,\n \
\ \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.02951470358398177\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113728,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113728\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6870967741935484,\n \"acc_stderr\": 0.02637756702864586,\n \"\
acc_norm\": 0.6870967741935484,\n \"acc_norm_stderr\": 0.02637756702864586\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"\
acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.03074630074212451,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.03074630074212451\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n\
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443135,\n \"\
acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443135\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489294,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489294\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n\
\ \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 0.8045977011494253,\n\
\ \"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242832,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242832\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30614525139664805,\n\
\ \"acc_stderr\": 0.01541449448790323,\n \"acc_norm\": 0.30614525139664805,\n\
\ \"acc_norm_stderr\": 0.01541449448790323\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.026568921015457152,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.026568921015457152\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776162,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776162\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42959582790091266,\n\
\ \"acc_stderr\": 0.01264300462379021,\n \"acc_norm\": 0.42959582790091266,\n\
\ \"acc_norm_stderr\": 0.01264300462379021\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6274509803921569,\n \"acc_stderr\": 0.019559646809215923,\n \
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.019559646809215923\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34149326805385555,\n\
\ \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.49078721070216347,\n\
\ \"mc2_stderr\": 0.015495976475887885\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174782\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3237300985595148,\n \
\ \"acc_stderr\": 0.01288824739737114\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Venomia-m7
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|arc:challenge|25_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|gsm8k|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hellaswag|10_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-38-57.975905.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T19-38-57.975905.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- '**/details_harness|winogrande|5_2023-12-09T19-38-57.975905.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T19-38-57.975905.parquet'
- config_name: results
data_files:
- split: 2023_12_09T19_38_57.975905
path:
- results_2023-12-09T19-38-57.975905.parquet
- split: latest
path:
- results_2023-12-09T19-38-57.975905.parquet
---
# Dataset Card for Evaluation run of Sao10K/Venomia-m7
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Venomia-m7
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Venomia-m7](https://huggingface.co/Sao10K/Venomia-m7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Venomia-m7",
"harness_winogrande_5",
split="train")
```
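The aggregated "results" configuration can be loaded the same way. A minimal sketch, assuming the "results" config and the "latest" split naming described above:
```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run;
# the "latest" split always points to the newest results file
results = load_dataset("open-llm-leaderboard/details_Sao10K__Venomia-m7",
	"results",
	split="latest")
print(results[0])
```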
## Latest results
These are the [latest results from run 2023-12-09T19:38:57.975905](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Venomia-m7/blob/main/results_2023-12-09T19-38-57.975905.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5994561896653553,
"acc_stderr": 0.032914574924459074,
"acc_norm": 0.6051903322222639,
"acc_norm_stderr": 0.03358493802168653,
"mc1": 0.34149326805385555,
"mc1_stderr": 0.016600688619950826,
"mc2": 0.49078721070216347,
"mc2_stderr": 0.015495976475887885
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.6313993174061433,
"acc_norm_stderr": 0.014097810678042194
},
"harness|hellaswag|10": {
"acc": 0.6635132443736308,
"acc_stderr": 0.004715419139697519,
"acc_norm": 0.8399721171081458,
"acc_norm_stderr": 0.0036588262081016106
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.02951470358398177,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.02951470358398177
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113728,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113728
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574925,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574925
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6870967741935484,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.6870967741935484,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.03074630074212451,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.03074630074212451
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7834862385321101,
"acc_stderr": 0.017658710594443135,
"acc_norm": 0.7834862385321101,
"acc_norm_stderr": 0.017658710594443135
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489294,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489294
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424383,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424383
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242832,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242832
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30614525139664805,
"acc_stderr": 0.01541449448790323,
"acc_norm": 0.30614525139664805,
"acc_norm_stderr": 0.01541449448790323
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.026568921015457152,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.026568921015457152
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776162,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776162
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42959582790091266,
"acc_stderr": 0.01264300462379021,
"acc_norm": 0.42959582790091266,
"acc_norm_stderr": 0.01264300462379021
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.019559646809215923,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.019559646809215923
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34149326805385555,
"mc1_stderr": 0.016600688619950826,
"mc2": 0.49078721070216347,
"mc2_stderr": 0.015495976475887885
},
"harness|winogrande|5": {
"acc": 0.7576953433307024,
"acc_stderr": 0.012042352526174782
},
"harness|gsm8k|5": {
"acc": 0.3237300985595148,
"acc_stderr": 0.01288824739737114
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
haseong8012/child-50k-adult-30k_for-train | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: audio
sequence: float32
splits:
- name: train
num_bytes: 15410653561
num_examples: 80000
download_size: 13245154984
dataset_size: 15410653561
---
# Dataset Card for "child-50k-adult-30k_for-train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Yura32000/eurosat | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': AnnualCrop
'1': Forest
'2': HerbaceousVegetation
'3': Highway
'4': Industrial
'5': Pasture
'6': PermanentCrop
'7': Residential
'8': River
'9': SeaLake
- name: choices
dtype: int64
- name: prices
dtype: int64
splits:
- name: train
num_bytes: 73997723.2
num_examples: 21600
- name: test
num_bytes: 9241099.7
num_examples: 2700
- name: valid
num_bytes: 9232043.9
num_examples: 2700
download_size: 91992228
dataset_size: 92470866.80000001
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
juancheng/MBE_height | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 8983230.0
num_examples: 49
- name: validation
num_bytes: 2439966.0
num_examples: 13
download_size: 0
dataset_size: 11423196.0
---
# Dataset Card for "MBE_height"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Phando/nlvr2-resized | ---
dataset_info:
features:
- name: identifier
dtype: string
- name: sentence
dtype: string
- name: label
dtype: string
- name: image0
dtype: image
- name: image1
dtype: image
splits:
- name: train
num_bytes: 40320845946.75
num_examples: 86373
- name: validation
num_bytes: 3285815561.536
num_examples: 6982
download_size: 42185258650
dataset_size: 43606661508.286
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
liuyanchen1015/MULTI_VALUE_cola_subord_conjunction_doubling | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 98
num_examples: 1
- name: test
num_bytes: 185
num_examples: 2
- name: train
num_bytes: 3322
num_examples: 34
download_size: 7915
dataset_size: 3605
---
# Dataset Card for "MULTI_VALUE_cola_subord_conjunction_doubling"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
faisal-hugging-face/plat-disease | ---
license: other
---
|
Pratha1m/dataset_with_ocr | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
sequence:
sequence:
sequence: uint8
- name: answer
dtype: string
- name: question
dtype: string
- name: words
sequence: string
- name: boxes
sequence:
sequence: int64
splits:
- name: train
num_bytes: 147847251
num_examples: 904
- name: test
num_bytes: 30521871
num_examples: 190
download_size: 37387817
dataset_size: 178369122
---
# Dataset Card for "dataset_with_ocr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
reza-alipour/CaHumnE | ---
dataset_info:
features:
- name: id
dtype: string
- name: caption
dtype: string
- name: col1
dtype: image
- name: col2
dtype: image
- name: col3
dtype: image
- name: col4
dtype: image
splits:
- name: train
num_bytes: 270773121.0
num_examples: 200
download_size: 270781217
dataset_size: 270773121.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Veucci/turkish-lyric-to-genre | ---
license: cc-by-nc-4.0
size_categories:
- 1K<n<10K
task_categories:
- text-classification
language:
- tr
tags:
- music
---
# Song Lyrics Dataset
## Description
This dataset contains a collection of Turkish song lyrics from various artists and genres. It is intended to be used for research, analysis, and other non-commercial purposes.
## Dataset Details
The dataset is organized in a tabular format with the following columns:
- `Genre` (int): The genre of the lyrics.
- `Lyrics` (str): The lyrics of the song.
Genre distribution:
- Pop: 1085 rows
- Rock: 765 rows
- Hip-Hop: 969 rows
- Arabesk: 353 rows
## Usage
Feel free to use this dataset for non-commercial purposes such as academic research, natural language processing tasks, sentiment analysis, or personal projects. You are allowed to analyze, modify, and derive insights from the dataset.
If you use this dataset in your work, we kindly request that you provide attribution by citing this repository or linking back to it.
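As a starting point, here is a minimal loading sketch with the `datasets` library (it assumes the repository id `Veucci/turkish-lyric-to-genre` and a single `train` split; adjust if the split layout differs):
```python
from datasets import load_dataset
from collections import Counter

# Load the lyrics dataset (a single "train" split is assumed here)
ds = load_dataset("Veucci/turkish-lyric-to-genre", split="train")

# Each row pairs an integer genre code with the raw lyric text
print(ds[0]["Genre"], ds[0]["Lyrics"][:80])

# Rough class balance, which should match the per-genre row counts above
print(Counter(ds["Genre"]))
```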
## License
This dataset is released under the Creative Commons Attribution-NonCommercial license. This means that you are not allowed to use the dataset for commercial purposes. For detailed information about the license, please refer to the [LICENSE](./LICENSE) file.
## Contact
If you have any questions, suggestions, or concerns regarding this dataset, please feel free to reach out by email at [efe.ozkan732@gmail.com](mailto:efe.ozkan732@gmail.com).
Happy exploring and analyzing the world of song lyrics!
|
liuyanchen1015/MULTI_VALUE_cola_acomp_focusing_like | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 10825
num_examples: 125
- name: test
num_bytes: 11817
num_examples: 135
- name: train
num_bytes: 89393
num_examples: 1108
download_size: 54146
dataset_size: 112035
---
# Dataset Card for "MULTI_VALUE_cola_acomp_focusing_like"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_DreadPoor__Kindred-7B-slerp | ---
pretty_name: Evaluation run of DreadPoor/Kindred-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DreadPoor/Kindred-7B-slerp](https://huggingface.co/DreadPoor/Kindred-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__Kindred-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-03-06T04:34:27.779341](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__Kindred-7B-slerp/blob/main/results_2024-03-06T04-34-27.779341.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.652918276624435,\n\
\ \"acc_stderr\": 0.03217962561512012,\n \"acc_norm\": 0.6527130051694514,\n\
\ \"acc_norm_stderr\": 0.0328482909372264,\n \"mc1\": 0.5250917992656059,\n\
\ \"mc1_stderr\": 0.017481446804104014,\n \"mc2\": 0.6812302607383641,\n\
\ \"mc2_stderr\": 0.014858561443304524\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6800341296928327,\n \"acc_stderr\": 0.01363134580701619,\n\
\ \"acc_norm\": 0.7175767918088737,\n \"acc_norm_stderr\": 0.013155456884097225\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7001593308105954,\n\
\ \"acc_stderr\": 0.004572515919210697,\n \"acc_norm\": 0.8778131846245768,\n\
\ \"acc_norm_stderr\": 0.0032683212609136295\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396262,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396262\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924003,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887034,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887034\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40893854748603353,\n\
\ \"acc_stderr\": 0.016442830654715544,\n \"acc_norm\": 0.40893854748603353,\n\
\ \"acc_norm_stderr\": 0.016442830654715544\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.02633661346904663,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.02633661346904663\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.02378858355165854,\n\
\ \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.02378858355165854\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n\
\ \"acc_stderr\": 0.012738547371303956,\n \"acc_norm\": 0.46479791395045633,\n\
\ \"acc_norm_stderr\": 0.012738547371303956\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.018798086284886887,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.018798086284886887\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5250917992656059,\n\
\ \"mc1_stderr\": 0.017481446804104014,\n \"mc2\": 0.6812302607383641,\n\
\ \"mc2_stderr\": 0.014858561443304524\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8310970797158642,\n \"acc_stderr\": 0.010529981411838911\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7020470053070508,\n \
\ \"acc_stderr\": 0.01259793223291453\n }\n}\n```"
repo_url: https://huggingface.co/DreadPoor/Kindred-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|arc:challenge|25_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|gsm8k|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hellaswag|10_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-06T04-34-27.779341.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-06T04-34-27.779341.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- '**/details_harness|winogrande|5_2024-03-06T04-34-27.779341.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-06T04-34-27.779341.parquet'
- config_name: results
data_files:
- split: 2024_03_06T04_34_27.779341
path:
- results_2024-03-06T04-34-27.779341.parquet
- split: latest
path:
- results_2024-03-06T04-34-27.779341.parquet
---
# Dataset Card for Evaluation run of DreadPoor/Kindred-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/Kindred-7B-slerp](https://huggingface.co/DreadPoor/Kindred-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__Kindred-7B-slerp",
"harness_winogrande_5",
split="train")
```
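The aggregated metrics live in the `results` configuration listed above; a minimal sketch for loading them (config and split names are taken from the YAML listing, not verified here):
```python
from datasets import load_dataset

# Aggregated results of the run; the "latest" split points to the newest eval.
results = load_dataset("open-llm-leaderboard/details_DreadPoor__Kindred-7B-slerp",
                       "results",
                       split="latest")
```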
## Latest results
These are the [latest results from run 2024-03-06T04:34:27.779341](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__Kindred-7B-slerp/blob/main/results_2024-03-06T04-34-27.779341.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.652918276624435,
"acc_stderr": 0.03217962561512012,
"acc_norm": 0.6527130051694514,
"acc_norm_stderr": 0.0328482909372264,
"mc1": 0.5250917992656059,
"mc1_stderr": 0.017481446804104014,
"mc2": 0.6812302607383641,
"mc2_stderr": 0.014858561443304524
},
"harness|arc:challenge|25": {
"acc": 0.6800341296928327,
"acc_stderr": 0.01363134580701619,
"acc_norm": 0.7175767918088737,
"acc_norm_stderr": 0.013155456884097225
},
"harness|hellaswag|10": {
"acc": 0.7001593308105954,
"acc_stderr": 0.004572515919210697,
"acc_norm": 0.8778131846245768,
"acc_norm_stderr": 0.0032683212609136295
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924003,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887034,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887034
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40893854748603353,
"acc_stderr": 0.016442830654715544,
"acc_norm": 0.40893854748603353,
"acc_norm_stderr": 0.016442830654715544
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.02633661346904663,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.02633661346904663
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.02378858355165854,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.02378858355165854
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303956,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303956
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.018798086284886887,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.018798086284886887
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5250917992656059,
"mc1_stderr": 0.017481446804104014,
"mc2": 0.6812302607383641,
"mc2_stderr": 0.014858561443304524
},
"harness|winogrande|5": {
"acc": 0.8310970797158642,
"acc_stderr": 0.010529981411838911
},
"harness|gsm8k|5": {
"acc": 0.7020470053070508,
"acc_stderr": 0.01259793223291453
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ArmelR/hundred | ---
dataset_info:
features:
- name: question_id
dtype: int64
- name: parent_answer_post_id
dtype: int64
- name: prob
dtype: float64
- name: snippet
dtype: string
- name: intent
dtype: string
- name: id
dtype: string
- name: rewritten_intent
dtype: string
splits:
- name: train
num_bytes: 20242
num_examples: 100
- name: test
num_bytes: 1397
num_examples: 7
download_size: 23587
dataset_size: 21639
---
# Dataset Card for "hundred"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ryan-Pupia/CS482-HousingDataSet | ---
dataset_info:
features:
- name: log_stand__housing_median_age
dtype: float64
- name: log_stand__total_rooms
dtype: float64
- name: log_stand__total_bedrooms
dtype: float64
- name: log_stand__population
dtype: float64
- name: log_stand__households
dtype: float64
- name: log_stand__median_income
dtype: float64
- name: log_stand__median_house_value
dtype: float64
- name: encode__ocean_proximity_<1H OCEAN
dtype: float64
- name: encode__ocean_proximity_INLAND
dtype: float64
- name: encode__ocean_proximity_ISLAND
dtype: float64
- name: encode__ocean_proximity_NEAR BAY
dtype: float64
- name: encode__ocean_proximity_NEAR OCEAN
dtype: float64
- name: scale__longitude
dtype: float64
- name: scale__latitude
dtype: float64
splits:
- name: train
num_bytes: 1648864
num_examples: 14722
- name: test
num_bytes: 412272
num_examples: 3681
download_size: 1130408
dataset_size: 2061136
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
license: mit
language:
- en
pretty_name: Pre-Processed Housing Data
---
This dataset consists of processed, pre-split data (train/test) for producing and validating a model on California housing data. |
IWant2TryHard/ytcomments | ---
license: mit
---
|
raicrits/news_urls | ---
license: other
---
A collection of about 21k URLs of news articles taken from RAI news sites (national and regional). The file "urls_train_set.csv" contains around 20k of them, referring to articles published in the period 01/01/2022 – 09/03/2023, while the file "urls_test_set.csv" contains URLs referring to articles published in the period 10/03/2023 – 04/05/2023.
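
A minimal way to read one of these files, assuming the CSVs sit at the repo root under the names quoted above (the column layout is not documented here, so inspect the header after loading):

```python
import pandas as pd
from huggingface_hub import hf_hub_download

# Fetch one CSV from the dataset repo and load it with pandas.
path = hf_hub_download(repo_id="raicrits/news_urls",
                       filename="urls_train_set.csv",
                       repo_type="dataset")
urls = pd.read_csv(path)
print(len(urls), list(urls.columns))  # inspect the actual column names
```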
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_191 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1308756024.0
num_examples: 257022
download_size: 1335935590
dataset_size: 1308756024.0
---
# Dataset Card for "chunk_191"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_50_1713096786 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2815459
num_examples: 6945
download_size: 1444421
dataset_size: 2815459
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
eb/num2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 130139906
num_examples: 1000000
download_size: 69692182
dataset_size: 130139906
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
akhmedsakip/music-berkeley-emotions | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: label
dtype:
class_label:
names:
'0': L
'1': A
'2': K
'3': D
'4': C
'5': H
'6': F
'7': E
'8': J
'9': G
'10': M
'11': B
'12': I
splits:
- name: train
num_bytes: 225090972.57043225
num_examples: 1392
- name: test
num_bytes: 39838241.765567765
num_examples: 246
download_size: 263271905
dataset_size: 264929214.33600003
---
# Dataset Card for "music-berkeley-emotions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sachith-surge/LaMini-LM-raw-instruction-only-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 401407.7971614429
num_examples: 1505
download_size: 212603
dataset_size: 401407.7971614429
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "LaMini-LM-raw-instruction-only-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vinnyyw/Maitesong | ---
license: openrail
---
|
KnutJaegersberg/longinstruct | ---
license: mit
---
|
GabiRayman/melodea_data_test_main | ---
dataset_info:
features:
- name: Description
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 31774
num_examples: 54
download_size: 21802
dataset_size: 31774
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lincoln/newsquadfr | ---
annotations_creators:
- private
language_creators: null
language:
- fr-FR
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
- newspaper
- online
task_categories:
- question-answering
task_ids:
- extractive-qa
- open-domain-qa
paperswithcode_id: null
---
# Dataset Card for newsquadfr
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [lincoln.fr](https://www.lincoln.fr/)
- **Repository:** [github/Lincoln-France](https://github.com/Lincoln-France)
- **Paper:** [Needs More Information]
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [email](labinnovation@mel.lincoln.fr)
### Dataset Summary
newsquadfr is a small dataset created for the question answering task. Contexts are paragraphs of articles extracted from nine online French newspapers during the years 2020/2021. newsquadfr stands for "newspaper question answering dataset in French" and is inspired by the PIAF and SQuAD datasets. It contains 2,520 context-question-answer triplets.
```py
from datasets import load_dataset
ds_name = 'lincoln/newsquadfr'
# example 1
ds_newsquad = load_dataset(ds_name)
# example 2
data_files = {'train': 'train.json', 'test': 'test.json', 'valid': 'valid.json'}
ds_newsquad = load_dataset(ds_name, data_files=data_files)
# example 3
ds_newsquad = load_dataset(ds_name, data_files=data_files, split="valid+test")
```
Number of training-set examples per source website:
| website | Nb |
|---------------|-----|
| cnews | 20 |
| francetvinfo | 40 |
| la-croix | 375 |
| lefigaro | 160 |
| lemonde | 325 |
| lesnumeriques | 70 |
| numerama | 140 |
| sudouest | 475 |
| usinenouvelle | 45 |
### Supported Tasks and Leaderboards
- extractive-qa
- open-domain-qa
### Languages
French (fr-FR)
## Dataset Structure
### Data Instances
```py
{'answers': {'answer_start': [53], 'text': ['manœuvre "agressive']},
'article_id': 34138,
'article_title': 'Caricatures, Libye, Haut-Karabakh... Les six dossiers qui '
'opposent Emmanuel Macron et Recep Tayyip Erdogan.',
'article_url': 'https://www.francetvinfo.fr/monde/turquie/caricatures-libye-haut-karabakh-les-six-dossiers-qui-opposent-emmanuel-macron-et-recep-tayyip-erdogan_4155611.html#xtor=RSS-3-[france]',
 'context': 'Dans ce contexte déjà tendu, la France a dénoncé une manœuvre '
'"agressive" de la part de frégates turques à l\'encontre de l\'un '
"de ses navires engagés dans une mission de l'Otan, le 10 juin. "
'Selon Paris, la frégate Le Courbet cherchait à identifier un '
'cargo suspecté de transporter des armes vers la Libye quand elle '
'a été illuminée à trois reprises par le radar de conduite de tir '
"de l'escorte turque.",
'id': '2261',
'paragraph_id': 201225,
'question': "Qu'est ce que la France reproche à la Turquie?",
'website': 'francetvinfo'}
```
### Data Fields
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int64` feature.
- `article_id`: a `int64` feature.
- `article_title`: a string feature.
- `article_url`: a string feature.
- `context`: a `string` feature.
- `id`: a `string` feature.
- `paragraph_id`: a `int64` feature.
- `question`: a `string` feature.
- `website`: a `string` feature.
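Since `answer_start` is a character offset into `context`, the answer span can be recovered by slicing. A minimal sketch, assuming SQuAD-style exact offsets:
```py
from datasets import load_dataset

ds = load_dataset('lincoln/newsquadfr', split='train')
example = ds[0]
# `answer_start` is a character offset into `context` (SQuAD-style assumption).
start = example['answers']['answer_start'][0]
text = example['answers']['text'][0]
assert example['context'][start:start + len(text)] == text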
### Data Splits
| Split | Nb |
|-------|----|
| train |1650|
| test |415 |
| valid |455 |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
Paragraphs were chosen according to these rules (sketched in code below):
- parent article must have more than 71% ASCII characters
- paragraphs size must be between 170 and 670 characters
- paragraphs shouldn't contain "A LIRE" or "A VOIR AUSSI"
Then, we stratified our original dataset to create this dataset according to:
- website
- number of named entities
- paragraph size
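A rough sketch of the selection rules in code (not the authors' actual pipeline; the thresholds are taken verbatim from the rules above):
```py
# Sketch of the paragraph selection rules; hypothetical helpers, not the authors' code.
def ascii_ratio(text: str) -> float:
    return sum(c.isascii() for c in text) / max(len(text), 1)

def keep_paragraph(article_text: str, paragraph: str) -> bool:
    return (
        ascii_ratio(article_text) > 0.71      # parent article must be >71% ASCII
        and 170 <= len(paragraph) <= 670      # paragraph length bounds (characters)
        and 'A LIRE' not in paragraph
        and 'A VOIR AUSSI' not in paragraph
    )
```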
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
Annotation was done with the PIAF annotation tool, mostly by three different people.
#### Who are the annotators?
Lincoln
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
- The annotation process was not tightly controlled
- Asking questions about news articles introduces bias
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
https://creativecommons.org/licenses/by-nc-sa/4.0/deed.fr
### Citation Information
[Needs More Information] |
growth-cadet/criteria_validatorspass_eval_mistralfinetuned05_autotrain | ---
dataset_info:
features:
- name: ats
dtype: string
- name: context
dtype: string
- name: sys5_obj
struct:
- name: focus_areas
list:
- name: description
dtype: string
- name: subject
dtype: string
- name: industries
list:
- name: description
dtype: string
- name: subject
dtype: string
- name: products_and_technologies
list:
- name: description
dtype: string
- name: subject
dtype: string
- name: eval_crit
struct:
- name: focus_areas
dtype: float64
- name: industries
dtype: float64
- name: products_and_technologies
dtype: float64
- name: eval_values
struct:
- name: focus_areas
sequence: int64
- name: industries
sequence: int64
- name: products_and_technologies
sequence: int64
- name: uuid
dtype: string
- name: prompt
dtype: string
- name: raw_output
dtype: string
- name: pass_pydantic
dtype: int64
- name: pass_eval_embedd
dtype: int64
- name: eval_finetuned_values
struct:
- name: focus_areas
sequence: int64
- name: industries
sequence: int64
- name: products_and_technologies
sequence: int64
- name: eval_finetuned_crit
struct:
- name: focus_areas
dtype: float64
- name: industries
dtype: float64
- name: products_and_technologies
dtype: float64
splits:
- name: train
num_bytes: 29402258
num_examples: 2148
download_size: 13743660
dataset_size: 29402258
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Marcola1/Marco | ---
license: openrail
---
|
l4zy0n3/stripe_metrics_cyber_monday_23 | ---
license: gpl-3.0
---
|
enoahjr/twitter_dataset_1713147710 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 90104
num_examples: 209
download_size: 29074
dataset_size: 90104
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_roneneldan__TinyStories-1M | ---
pretty_name: Evaluation run of roneneldan/TinyStories-1M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [roneneldan/TinyStories-1M](https://huggingface.co/roneneldan/TinyStories-1M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_roneneldan__TinyStories-1M\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T21:41:24.294253](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-1M/blob/main/results_2023-09-22T21-41-24.294253.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00020973154362416107,\n\
\ \"em_stderr\": 0.00014829481977282063,\n \"f1\": 0.003178481543624158,\n\
\ \"f1_stderr\": 0.0002730192207643319,\n \"acc\": 0.26085240726124703,\n\
\ \"acc_stderr\": 0.007019619608242314\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.00020973154362416107,\n \"em_stderr\": 0.00014829481977282063,\n\
\ \"f1\": 0.003178481543624158,\n \"f1_stderr\": 0.0002730192207643319\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5217048145224941,\n\
\ \"acc_stderr\": 0.014039239216484627\n }\n}\n```"
repo_url: https://huggingface.co/roneneldan/TinyStories-1M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T21_41_24.294253
path:
- '**/details_harness|drop|3_2023-09-22T21-41-24.294253.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T21-41-24.294253.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T21_41_24.294253
path:
- '**/details_harness|gsm8k|5_2023-09-22T21-41-24.294253.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T21-41-24.294253.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T21_41_24.294253
path:
- '**/details_harness|winogrande|5_2023-09-22T21-41-24.294253.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T21-41-24.294253.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- results_2023-07-19T13:25:02.593147.parquet
- split: 2023_09_22T21_41_24.294253
path:
- results_2023-09-22T21-41-24.294253.parquet
- split: latest
path:
- results_2023-09-22T21-41-24.294253.parquet
---
# Dataset Card for Evaluation run of roneneldan/TinyStories-1M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/roneneldan/TinyStories-1M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-1M](https://huggingface.co/roneneldan/TinyStories-1M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_roneneldan__TinyStories-1M",
"harness_winogrande_5",
	split="latest")
```
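To see which of the 64 task configurations are available before loading one, a small sketch using the standard `datasets` helper:
```python
from datasets import get_dataset_config_names

# List every per-task configuration stored in this details repository.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_roneneldan__TinyStories-1M"
)
print(len(configs), configs[:5])
```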
## Latest results
These are the [latest results from run 2023-09-22T21:41:24.294253](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-1M/blob/main/results_2023-09-22T21-41-24.294253.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.00020973154362416107,
"em_stderr": 0.00014829481977282063,
"f1": 0.003178481543624158,
"f1_stderr": 0.0002730192207643319,
"acc": 0.26085240726124703,
"acc_stderr": 0.007019619608242314
},
"harness|drop|3": {
"em": 0.00020973154362416107,
"em_stderr": 0.00014829481977282063,
"f1": 0.003178481543624158,
"f1_stderr": 0.0002730192207643319
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5217048145224941,
"acc_stderr": 0.014039239216484627
}
}
```
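To work with these aggregated numbers directly, a minimal sketch (the `results` config and `latest` split are taken from the YAML header above):
```python
from datasets import load_dataset

# The "results" config aggregates every run; its "latest" split points at
# the 2023-09-22 run whose metrics are shown above.
results = load_dataset(
    "open-llm-leaderboard/details_roneneldan__TinyStories-1M",
    "results",
    split="latest",
)
print(results[0])
```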
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
zzh1997/test | ---
task_categories:
- token-classification
language:
- ch
pretty_name: ss
--- |
H-Huang/complex_word_identification | ---
license: cc-by-4.0
---
source: https://www.inf.uni-hamburg.de/en/inst/ab/lt/resources/data/complex-word-identification-dataset.html |
Yunij/fake_dataset_only | ---
license: apache-2.0
dataset_info:
features:
- name: image
dtype: string
- name: actual
dtype: int64
- name: feature_embeddings
sequence: float64
splits:
- name: train
num_bytes: 764035610
num_examples: 123287
download_size: 622264333
dataset_size: 764035610
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arthurmluz/cstnews_data-wiki_1024_results | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 50681
num_examples: 16
download_size: 0
dataset_size: 50681
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "cstnews_data-wiki_1024_results"
rouge = {'rouge1': 0.2556160962870545, 'rouge2': 0.10105330295297661, 'rougeL': 0.18616276144956143, 'rougeLsum': 0.18616276144956143}
bert = {'precision': 0.7447284124791622, 'recall': 0.6668070293962955, 'f1': 0.7032350450754166}
mover = 0.5737413741663241
Hilalcelik/turkishReviews-ds-mini | ---
dataset_info:
features:
- name: review
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 1252876.2642514652
num_examples: 3378
- name: validation
num_bytes: 139455.7357485349
num_examples: 376
download_size: 0
dataset_size: 1392332.0
---
# Dataset Card for "turkishReviews-ds-mini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mithlesh/Flight_Ticket_Booking_Json | ---
dataset_info:
features:
- name: Context
dtype: string
- name: Knowledge
dtype: string
- name: Response
dtype: string
splits:
- name: train
num_bytes: 2418454
num_examples: 2126
download_size: 109855
dataset_size: 2418454
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
deepghs/monochrome_danbooru | ---
license: mit
---
This dataset is designed to train models for monochrome image classification, and the data is mainly sourced from danbooru and pixiv.
**All images in this dataset are guaranteed to be in JPEG format with a file extension of `.jpg`**.
Currently, it has three different versions:
| Version | Description | Monochrome Images | Normal Images | Total Images |
|:-----------------------------------------------------------------------------:|:--------------------:|:-----------------:|:-------------:|:------------:|
| [v1.0](https://huggingface.co/datasets/deepghs/monochrome_danbooru/tree/v1.0) | Oldest version | 304 | 174 | 478 |
| [v1.1](https://huggingface.co/datasets/deepghs/monochrome_danbooru/tree/v1.1) | Balanced samples | 528 | 514 | 1042 |
| [v2.0](https://huggingface.co/datasets/deepghs/monochrome_danbooru/tree/v2.0) | More complex samples | 2208 | 2250 | 4458 |
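A minimal sketch for fetching one tagged version of the data (the `revision` values correspond to the branch links in the table above; `snapshot_download` is the generic Hub helper, so this only downloads the files rather than parsing them):
```python
from huggingface_hub import snapshot_download

# Download the v2.0 revision of the dataset repository into the local cache.
local_path = snapshot_download(
    repo_id="deepghs/monochrome_danbooru",
    repo_type="dataset",
    revision="v2.0",
)
print(local_path)  # directory containing the .jpg files for v2.0
```
|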
KoladeOdunope/RLHF_Dataset | ---
dataset_info:
features:
- name: query
dtype: string
- name: rejected
dtype: string
- name: accepted
dtype: string
- name: __index_level_0__
dtype: string
splits:
- name: train
num_bytes: 6406390
num_examples: 1029
download_size: 2612881
dataset_size: 6406390
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mouryachinta/yanthraa-mourya-1500 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1750135
num_examples: 1500
download_size: 1077149
dataset_size: 1750135
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
thegreyhound/products | ---
license: unknown
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.3_seed_2 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43761297
num_examples: 18928
- name: epoch_1
num_bytes: 44345524
num_examples: 18928
- name: epoch_2
num_bytes: 44437506
num_examples: 18928
- name: epoch_3
num_bytes: 44469689
num_examples: 18928
- name: epoch_4
num_bytes: 44474667
num_examples: 18928
- name: epoch_5
num_bytes: 44464460
num_examples: 18928
- name: epoch_6
num_bytes: 44442708
num_examples: 18928
- name: epoch_7
num_bytes: 44422364
num_examples: 18928
- name: epoch_8
num_bytes: 44414794
num_examples: 18928
- name: epoch_9
num_bytes: 44410898
num_examples: 18928
- name: epoch_10
num_bytes: 44411476
num_examples: 18928
- name: epoch_11
num_bytes: 44406870
num_examples: 18928
- name: epoch_12
num_bytes: 44409267
num_examples: 18928
- name: epoch_13
num_bytes: 44409143
num_examples: 18928
- name: epoch_14
num_bytes: 44408111
num_examples: 18928
- name: epoch_15
num_bytes: 44407603
num_examples: 18928
- name: epoch_16
num_bytes: 44408486
num_examples: 18928
- name: epoch_17
num_bytes: 44405857
num_examples: 18928
- name: epoch_18
num_bytes: 44406319
num_examples: 18928
- name: epoch_19
num_bytes: 44406957
num_examples: 18928
- name: epoch_20
num_bytes: 44405910
num_examples: 18928
- name: epoch_21
num_bytes: 44406498
num_examples: 18928
- name: epoch_22
num_bytes: 44406929
num_examples: 18928
- name: epoch_23
num_bytes: 44405194
num_examples: 18928
- name: epoch_24
num_bytes: 44405536
num_examples: 18928
- name: epoch_25
num_bytes: 44405889
num_examples: 18928
- name: epoch_26
num_bytes: 44404896
num_examples: 18928
- name: epoch_27
num_bytes: 44404886
num_examples: 18928
- name: epoch_28
num_bytes: 44406038
num_examples: 18928
- name: epoch_29
num_bytes: 44407507
num_examples: 18928
download_size: 700201551
dataset_size: 1331783279
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
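A minimal sketch for loading one epoch of this preference data (the config and split names are copied verbatim from the YAML above):
```python
from datasets import load_dataset

# Each of the 30 training epochs is stored as its own split (epoch_0 .. epoch_29).
ds = load_dataset(
    "Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.3_seed_2",
    "alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1",
    split="epoch_0",
)
print(ds.column_names)  # instruction, input, output, preference, ...
```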
|
pszemraj/bookcorpus_deduplicated-formatted | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
source_datasets:
- saibo/bookcorpus_deduplicated
- bookcorpus
---
# bookcorpus_deduplicated-formatted
Based on `saibo/bookcorpus_deduplicated`, an attempt was made to fix , perhaps , issues like this where the whitespace is nonsensical .
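A minimal sketch of the kind of cleanup this implies (the regex rules below are an assumption for illustration, not the exact pipeline used):
```python
import re

def normalize_spacing(text: str) -> str:
    # Collapse runs of whitespace into single spaces.
    text = re.sub(r"\s+", " ", text)
    # Remove the stray space left before punctuation (" , " -> ", ").
    text = re.sub(r"\s+([.,!?;:])", r"\1", text)
    return text.strip()

print(normalize_spacing("an attempt was made to fix , perhaps , issues ."))
# -> "an attempt was made to fix, perhaps, issues."
```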
## Why This Matters
Spaces are like the unsung heroes of readable content . They maintain order , balance, and rhythm . But sometimes, they just want to rebel and go where they're not needed . Just like cats sitting where they shouldn't . Like, why are you on the fridge , Mr.Whiskers ?
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/7bd33ae8 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1336
dataset_size: 184
---
# Dataset Card for "7bd33ae8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_alignment-handbook__zephyr-7b-dpo-qlora | ---
pretty_name: Evaluation run of alignment-handbook/zephyr-7b-dpo-qlora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [alignment-handbook/zephyr-7b-dpo-qlora](https://huggingface.co/alignment-handbook/zephyr-7b-dpo-qlora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alignment-handbook__zephyr-7b-dpo-qlora\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-26T21:27:47.387655](https://huggingface.co/datasets/open-llm-leaderboard/details_alignment-handbook__zephyr-7b-dpo-qlora/blob/main/results_2024-01-26T21-27-47.387655.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6369766338808557,\n\
\ \"acc_stderr\": 0.03238152491968989,\n \"acc_norm\": 0.641831921094304,\n\
\ \"acc_norm_stderr\": 0.033030780304730514,\n \"mc1\": 0.31701346389228885,\n\
\ \"mc1_stderr\": 0.016289203374403385,\n \"mc2\": 0.4714491532888518,\n\
\ \"mc2_stderr\": 0.014683410665396914\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6083617747440273,\n \"acc_stderr\": 0.014264122124938217,\n\
\ \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068285\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.658334993029277,\n\
\ \"acc_stderr\": 0.004732986187325878,\n \"acc_norm\": 0.8535152360087632,\n\
\ \"acc_norm_stderr\": 0.0035286889976580533\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.0252798503974049,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.0252798503974049\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n\
\ \"acc_stderr\": 0.024472243840895525,\n \"acc_norm\": 0.7548387096774194,\n\
\ \"acc_norm_stderr\": 0.024472243840895525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593542,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593542\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062153,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062153\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465076,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465076\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.016530617409266878,\n \"\
acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266878\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822915,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822915\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973133,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973133\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33743016759776534,\n\
\ \"acc_stderr\": 0.015813901283913048,\n \"acc_norm\": 0.33743016759776534,\n\
\ \"acc_norm_stderr\": 0.015813901283913048\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n\
\ \"acc_stderr\": 0.012725701656953638,\n \"acc_norm\": 0.45827900912646674,\n\
\ \"acc_norm_stderr\": 0.012725701656953638\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786855,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786855\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31701346389228885,\n\
\ \"mc1_stderr\": 0.016289203374403385,\n \"mc2\": 0.4714491532888518,\n\
\ \"mc2_stderr\": 0.014683410665396914\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42077331311599697,\n \
\ \"acc_stderr\": 0.013598489497182838\n }\n}\n```"
repo_url: https://huggingface.co/alignment-handbook/zephyr-7b-dpo-qlora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|arc:challenge|25_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|gsm8k|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hellaswag|10_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T21-27-47.387655.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T21-27-47.387655.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- '**/details_harness|winogrande|5_2024-01-26T21-27-47.387655.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-26T21-27-47.387655.parquet'
- config_name: results
data_files:
- split: 2024_01_26T21_27_47.387655
path:
- results_2024-01-26T21-27-47.387655.parquet
- split: latest
path:
- results_2024-01-26T21-27-47.387655.parquet
---
# Dataset Card for Evaluation run of alignment-handbook/zephyr-7b-dpo-qlora
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of the model [alignment-handbook/zephyr-7b-dpo-qlora](https://huggingface.co/alignment-handbook/zephyr-7b-dpo-qlora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset

# Per-sample details for one task (here: Winogrande, 5-shot);
# the available splits are the run timestamp and "latest" (see the YAML above)
data = load_dataset(
    "open-llm-leaderboard/details_alignment-handbook__zephyr-7b-dpo-qlora",
    "harness_winogrande_5",
    split="latest",
)
```
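The aggregated scores can be loaded the same way; a minimal sketch, assuming only the "results" configuration and "latest" split declared in the YAML above:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_alignment-handbook__zephyr-7b-dpo-qlora",
    "results",
    split="latest",
)
print(results[0])  # typically a single row holding the aggregated metrics
```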
## Latest results
These are the [latest results from run 2024-01-26T21:27:47.387655](https://huggingface.co/datasets/open-llm-leaderboard/details_alignment-handbook__zephyr-7b-dpo-qlora/blob/main/results_2024-01-26T21-27-47.387655.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6369766338808557,
"acc_stderr": 0.03238152491968989,
"acc_norm": 0.641831921094304,
"acc_norm_stderr": 0.033030780304730514,
"mc1": 0.31701346389228885,
"mc1_stderr": 0.016289203374403385,
"mc2": 0.4714491532888518,
"mc2_stderr": 0.014683410665396914
},
"harness|arc:challenge|25": {
"acc": 0.6083617747440273,
"acc_stderr": 0.014264122124938217,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068285
},
"harness|hellaswag|10": {
"acc": 0.658334993029277,
"acc_stderr": 0.004732986187325878,
"acc_norm": 0.8535152360087632,
"acc_norm_stderr": 0.0035286889976580533
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.0252798503974049,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.0252798503974049
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895525,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593542,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062153,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062153
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465076,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465076
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266878,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266878
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822915,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822915
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973133,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973133
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33743016759776534,
"acc_stderr": 0.015813901283913048,
"acc_norm": 0.33743016759776534,
"acc_norm_stderr": 0.015813901283913048
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45827900912646674,
"acc_stderr": 0.012725701656953638,
"acc_norm": 0.45827900912646674,
"acc_norm_stderr": 0.012725701656953638
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786855,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786855
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31701346389228885,
"mc1_stderr": 0.016289203374403385,
"mc2": 0.4714491532888518,
"mc2_stderr": 0.014683410665396914
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.01144628062926263
},
"harness|gsm8k|5": {
"acc": 0.42077331311599697,
"acc_stderr": 0.013598489497182838
}
}
```
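The "all" block is an aggregate over the individual tasks; below is a minimal sketch that reproduces it from the raw results file, assuming the metrics shown above sit under a "results" key in that JSON (a guess about the file layout, not something the card states):
```python
from huggingface_hub import hf_hub_download
import json
import statistics

# Download the raw results file for this run (filename taken from the link above)
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_alignment-handbook__zephyr-7b-dpo-qlora",
    filename="results_2024-01-26T21-27-47.387655.json",
    repo_type="dataset",
)
with open(path) as f:
    blob = json.load(f)

# Assumption: per-task metrics live under a "results" key; fall back to the
# top level if the file is already the metrics dict printed above.
metrics = blob.get("results", blob)
task_accs = [v["acc"] for k, v in metrics.items() if k != "all" and "acc" in v]
print(statistics.mean(task_accs))  # should land close to the "all" accuracy above
```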
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
EgilKarlsen/Spirit_RoBERTa_Baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115650065.625
num_examples: 37500
- name: test
num_bytes: 38550020.0
num_examples: 12500
download_size: 211789418
dataset_size: 154200085.625
---
# Dataset Card for "Spirit_RoBERTa_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhangyue/test.one | ---
dataset_info:
features:
- name: id
dtype: string
- name: package_name
dtype: string
- name: review
dtype: string
- name: date
dtype: string
- name: star
dtype: int64
- name: version_id
dtype: int64
splits:
- name: train
num_bytes: 1508
num_examples: 5
- name: test
num_bytes: 956
num_examples: 5
download_size: 9453
dataset_size: 2464
---
# Dataset Card for "test.one"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |