| datasetId | card |
|---|---|
polymath707/indollama2 | ---
license: apache-2.0
--- |
scielo | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
- es
- pt
license:
- unknown
multilinguality:
- multilingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- translation
task_ids: []
paperswithcode_id: null
pretty_name: SciELO
dataset_info:
- config_name: en-es
features:
- name: translation
dtype:
translation:
languages:
- en
- es
splits:
- name: train
num_bytes: 71777213
num_examples: 177782
download_size: 22965217
dataset_size: 71777213
- config_name: en-pt
features:
- name: translation
dtype:
translation:
languages:
- en
- pt
splits:
- name: train
num_bytes: 1032669686
num_examples: 2828917
download_size: 322726075
dataset_size: 1032669686
- config_name: en-pt-es
features:
- name: translation
dtype:
translation:
languages:
- en
- pt
- es
splits:
- name: train
num_bytes: 147472132
num_examples: 255915
download_size: 45556562
dataset_size: 147472132
config_names:
- en-es
- en-pt
- en-pt-es
---
# Dataset Card for SciELO
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [SciELO](https://sites.google.com/view/felipe-soares/datasets#h.p_92uSCyAjWSRB)
- **Repository:**
- **Paper:** [A Large Parallel Corpus of Full-Text Scientific Articles](https://arxiv.org/abs/1905.01852)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
A parallel corpus of full-text scientific articles collected from the SciELO database in the following languages: English, Portuguese, and Spanish.
The corpus is sentence-aligned for all language pairs, and a small subset of sentences is additionally aligned trilingually.
Alignment was carried out using the Hunalign algorithm.
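Each example stores the aligned sentences in a `translation` dictionary keyed by language code, as declared in the feature specification above. A minimal sketch of working with such a record (the sentence pair below is invented for illustration):

```python
# Hypothetical record mirroring the en-es feature spec; the sentences
# are made up for illustration, not taken from the corpus.
example = {
    "translation": {
        "en": "The cells were cultured for 24 hours.",
        "es": "Las celulas se cultivaron durante 24 horas.",
    }
}

def to_pair(example, src="en", tgt="es"):
    """Split a translation record into (source, target) sentences."""
    t = example["translation"]
    return t[src], t[tgt]

src_text, tgt_text = to_pair(example)
```

The same access pattern applies to the `en-pt` and `en-pt-es` configurations, with the dictionary keyed by the corresponding language codes.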
### Supported Tasks and Leaderboards
The underlying task is machine translation.
### Languages
The corpus covers English (`en`), Spanish (`es`), and Portuguese (`pt`), with bilingual en-es and en-pt configurations and a trilingual en-pt-es configuration.
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
Each example has a single `translation` field: a dictionary mapping language codes to the aligned sentences for that example's configuration.
### Data Splits
Each configuration has a single `train` split: 177,782 examples (en-es), 2,828,917 examples (en-pt), and 255,915 examples (en-pt-es).
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@inproceedings{soares2018large,
title={A Large Parallel Corpus of Full-Text Scientific Articles},
author={Soares, Felipe and Moreira, Viviane and Becker, Karin},
booktitle={Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC-2018)},
year={2018}
}
```
### Contributions
Thanks to [@patil-suraj](https://github.com/patil-suraj) for adding this dataset. |
mozart-ai/info-qa | ---
dataset_info:
features:
- name: url
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 103523
num_examples: 612
download_size: 28661
dataset_size: 103523
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "info-qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ARDT-Project/arrl_nrmdp_train_halfcheetah | ---
dataset_info:
features:
- name: observations
sequence:
sequence: float64
- name: pr_actions
sequence:
sequence: float64
- name: adv_actions
sequence:
sequence: float64
- name: rewards
sequence: float64
- name: dones
sequence: bool
splits:
- name: train
num_bytes: 503785750
num_examples: 2000
download_size: 344355608
dataset_size: 503785750
---
# Dataset Card for "arrl_nrmdp_train_halfcheetah_v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jxm/private_prompts_2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: value
dtype: string
- name: field
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 30972934
num_examples: 251270
download_size: 8631699
dataset_size: 30972934
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "private_prompts_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kuokxuen/marketing_dataset | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 197836
num_examples: 100
download_size: 120464
dataset_size: 197836
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "marketing_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jessica-ecosia/gpdr-dpr-dataset | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
- name: embeddings
sequence:
sequence: float64
splits:
- name: train
num_bytes: 4191740
num_examples: 620
download_size: 0
dataset_size: 4191740
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gpdr-dpr-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tasksource/sen-making | ---
task_categories:
- text-classification
- multiple-choice
language:
- en
tags:
- explanation
---
https://github.com/wangcunxiang/Sen-Making-and-Explanation
```
@inproceedings{wang-etal-2019-make,
title = "Does it Make Sense? And Why? A Pilot Study for Sense Making and Explanation",
author = "Wang, Cunxiang and
Liang, Shuailong and
Zhang, Yue and
Li, Xiaonan and
Gao, Tian",
booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2019",
address = "Florence, Italy",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/P19-1393",
pages = "4020--4026",
abstract = "Introducing common sense to natural language understanding systems has received increasing research attention. It remains a fundamental question on how to evaluate whether a system has the sense-making capability. Existing benchmarks measure common sense knowledge indirectly or without reasoning. In this paper, we release a benchmark to directly test whether a system can differentiate natural language statements that make sense from those that do not make sense. In addition, a system is asked to identify the most crucial reason why a statement does not make sense. We evaluate models trained over large-scale language modeling tasks as well as human performance, showing that there are different challenges for system sense-making.",
}
``` |
Azure99/blossom-chat-v2 | ---
license: apache-2.0
task_categories:
- text-generation
- text2text-generation
language:
- zh
- en
size_categories:
- 10K<n<100K
---
# BLOSSOM CHAT V2
### Introduction
Blossom Chat V2 is a bilingual Chinese-English conversation dataset derived from ShareGPT 90K, suitable for multi-turn dialogue fine-tuning.
Compared to blossom-chat-v1, the data processing pipeline has been further refined and the Chinese and English corpora have been balanced.
This dataset extracts the multi-turn instructions from ShareGPT, translates only the instructions, and then iteratively calls gpt-3.5-turbo-0613 with the multi-turn instructions.
Compared with the original ShareGPT data, this mainly addresses the scarcity of Chinese dialogue data and the output truncation caused by ChatGPT's generation length limit.
This release contains 20% of the full data, comprising 30K records.
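The iterative regeneration process described above can be sketched as follows: replay the extracted user turns one at a time, appending each model reply to the context before issuing the next turn. `call_model` here is a hypothetical stand-in for a chat-completion API call, not part of the original pipeline:

```python
# Hedged sketch of the regeneration loop: replay extracted multi-turn
# user instructions, feeding the growing context back to the model.
# `call_model` is a hypothetical placeholder for a chat-completion call.
def rebuild_dialogue(user_turns, call_model):
    messages = []
    for turn in user_turns:
        messages.append({"role": "user", "content": turn})
        reply = call_model(messages)  # model sees the full history so far
        messages.append({"role": "assistant", "content": reply})
    return messages

# Toy stand-in model that just numbers its replies.
conv = rebuild_dialogue(
    ["Hi", "Tell me more"],
    lambda m: f"reply#{len(m) // 2 + 1}",
)
```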
### Languages
Primarily Chinese and English, mixed at a ratio of roughly 1:1.
### Dataset Structure
Each record represents one complete multi-turn conversation and contains two fields: id and conversations.
- id: a string holding the conversation id from the original ShareGPT; the original conversation can be accessed at https://sharegpt.com/c/id.
- conversations: an array of objects, each with role and content fields; role is either user or assistant (user input and assistant output respectively), and content holds the corresponding text.
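A hypothetical record illustrating this schema (the id and texts below are invented; real ids come from ShareGPT):

```python
# Invented record matching the documented schema: an id string plus a
# conversations array of alternating user/assistant turns.
record = {
    "id": "abc123",
    "conversations": [
        {"role": "user", "content": "Hello"},
        {"role": "assistant", "content": "Hi! How can I help?"},
    ],
}

roles = [turn["role"] for turn in record["conversations"]]
```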
### Dataset Limitations
Since only the inputs of the original multi-turn conversations were extracted, dialogues that involve randomness (for example, guessing a random number) may be incoherent across turns.
In addition, all responses in this dataset were generated by gpt-3.5-turbo-0613 without rigorous validation, and may contain inaccurate or even seriously wrong answers. |
open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-multilang-1024-20b | ---
pretty_name: Evaluation run of h2oai/h2ogpt-gm-oasst1-multilang-1024-20b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-gm-oasst1-multilang-1024-20b](https://huggingface.co/h2oai/h2ogpt-gm-oasst1-multilang-1024-20b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-multilang-1024-20b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-21T21:24:46.417181](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-multilang-1024-20b/blob/main/results_2023-10-21T21-24-46.417181.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003355704697986577,\n\
\ \"em_stderr\": 0.0005922452850005271,\n \"f1\": 0.056043414429530265,\n\
\ \"f1_stderr\": 0.0013596034176909157,\n \"acc\": 0.3531399801217468,\n\
\ \"acc_stderr\": 0.008551128750555435\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.003355704697986577,\n \"em_stderr\": 0.0005922452850005271,\n\
\ \"f1\": 0.056043414429530265,\n \"f1_stderr\": 0.0013596034176909157\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.021986353297952996,\n \
\ \"acc_stderr\": 0.004039162758110061\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6842936069455406,\n \"acc_stderr\": 0.01306309474300081\n\
\ }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-gm-oasst1-multilang-1024-20b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_21T21_24_46.417181
path:
- '**/details_harness|drop|3_2023-10-21T21-24-46.417181.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-21T21-24-46.417181.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_21T21_24_46.417181
path:
- '**/details_harness|gsm8k|5_2023-10-21T21-24-46.417181.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-21T21-24-46.417181.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:26:27.370097.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:26:27.370097.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:26:27.370097.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_21T21_24_46.417181
path:
- '**/details_harness|winogrande|5_2023-10-21T21-24-46.417181.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-21T21-24-46.417181.parquet'
- config_name: results
data_files:
- split: 2023_07_19T21_26_27.370097
path:
- results_2023-07-19T21:26:27.370097.parquet
- split: 2023_10_21T21_24_46.417181
path:
- results_2023-10-21T21-24-46.417181.parquet
- split: latest
path:
- results_2023-10-21T21-24-46.417181.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-gm-oasst1-multilang-1024-20b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-gm-oasst1-multilang-1024-20b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-gm-oasst1-multilang-1024-20b](https://huggingface.co/h2oai/h2ogpt-gm-oasst1-multilang-1024-20b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-multilang-1024-20b",
"harness_winogrande_5",
	split="latest")
```
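Each dated split is named after the run timestamp, with the `-` and `:` separators replaced by underscores. A quick sketch of the mapping (the rule is inferred from the split names in this card's config, not from official documentation):

```python
# Map a run timestamp to the split name used in this dataset's configs.
# Rule inferred from the split names above: '-' and ':' become '_'.
def timestamp_to_split(run_timestamp: str) -> str:
    return run_timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-21T21:24:46.417181"))
# -> 2023_10_21T21_24_46.417181
```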
## Latest results
These are the [latest results from run 2023-10-21T21:24:46.417181](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-multilang-1024-20b/blob/main/results_2023-10-21T21-24-46.417181.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its results and "latest" split):
```python
{
"all": {
"em": 0.003355704697986577,
"em_stderr": 0.0005922452850005271,
"f1": 0.056043414429530265,
"f1_stderr": 0.0013596034176909157,
"acc": 0.3531399801217468,
"acc_stderr": 0.008551128750555435
},
"harness|drop|3": {
"em": 0.003355704697986577,
"em_stderr": 0.0005922452850005271,
"f1": 0.056043414429530265,
"f1_stderr": 0.0013596034176909157
},
"harness|gsm8k|5": {
"acc": 0.021986353297952996,
"acc_stderr": 0.004039162758110061
},
"harness|winogrande|5": {
"acc": 0.6842936069455406,
"acc_stderr": 0.01306309474300081
}
}
```
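Per-task entries in the results are keyed as `harness|<task>|<n_shots>`. A minimal sketch of pulling one task's metric out of such a dictionary — the `results` dict below is abridged from the JSON above, and `task_metric` is an illustrative helper, not part of any library:

```python
# `results` is abridged from the latest-run JSON above.
results = {
    "all": {"acc": 0.3531399801217468, "acc_stderr": 0.008551128750555435},
    "harness|gsm8k|5": {"acc": 0.021986353297952996},
    "harness|winogrande|5": {"acc": 0.6842936069455406},
}

def task_metric(results, task, metric="acc"):
    """Return one metric for one task, matching on the middle key segment."""
    for key, scores in results.items():
        if key.startswith("harness|") and key.split("|")[1] == task:
            return scores.get(metric)
    return None

print(task_metric(results, "winogrande"))  # 0.6842936069455406
```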
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
huggingartists/shadowraze | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/shadowraze"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.063932 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/e2576b95c2049862de20cbd0f1a4e0d7.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/shadowraze">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">shadowraze</div>
<a href="https://genius.com/artists/shadowraze">
<div style="text-align: center; font-size: 14px;">@shadowraze</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/shadowraze).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/shadowraze")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|    14 |          - |    - |
The 'train' split can easily be divided into 'train', 'validation' and 'test' splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/shadowraze")

# Split percentages; the test split is whatever remains after train and validation
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

# np.split cuts at absolute indices, so convert the percentages to cut points
texts = datasets['train']['text']
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))],
)

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)})
    }
)
```
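The cut points passed to `np.split` are absolute indices derived from the percentages. A self-contained sketch with a toy 100-item list (stand-in data, same percentages) shows the resulting 90/7/3 split:

```python
import numpy as np

texts = [f"song {i}" for i in range(100)]  # stand-in for datasets['train']['text']
train_percentage = 0.9
validation_percentage = 0.07  # the test split is the remaining 0.03

# Convert percentages to absolute cut indices for np.split
cut1 = int(len(texts) * train_percentage)
cut2 = int(len(texts) * (train_percentage + validation_percentage))
train, validation, test = np.split(texts, [cut1, cut2])

print(len(train), len(validation), len(test))  # 90 7 3
```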
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
FarmerlineML/baoule_dataset_3 | ---
dataset_info:
features:
- name: transcription
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 111429663.09
num_examples: 1062
- name: test
num_bytes: 15991997.0
num_examples: 198
download_size: 128902960
dataset_size: 127421660.09
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Harshvardhan27/TLDR_Fine_Tuned_Mistral_Final_Model | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 536417
num_examples: 1000
- name: test
num_bytes: 106940
num_examples: 200
download_size: 427413
dataset_size: 643357
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
AlekseyKorshuk/davinci-pairwise | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 3835512036
num_examples: 143908
download_size: 800758913
dataset_size: 3835512036
---
# Dataset Card for "davinci-pairwise"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lionelchg/guanaco-llama2-2k | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3212963
num_examples: 2000
download_size: 1887828
dataset_size: 3212963
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card
This is a 2,000-example extract of https://huggingface.co/datasets/timdettmers/openassistant-guanaco |
Doctor-Shotgun/theory-of-mind-dpo | ---
language:
- en
---
This is [grimulkan/theory-of-mind](https://huggingface.co/datasets/grimulkan/theory-of-mind) with "rejected" responses generated using [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2), and the file formatted for use in DPO training.
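For reference, a DPO training record typically pairs one prompt with a preferred and a dispreferred response. The field names below (`prompt`, `chosen`, `rejected`) follow common DPO conventions and are an assumption — this card does not document the exact schema:

```python
# Hypothetical DPO record; field names are assumed, not confirmed by this card.
record = {
    "prompt": "Sally puts her keys in the drawer and leaves the room. "
              "Where will she look for them?",
    "chosen": "In the drawer, since she did not see them moved.",  # original response
    "rejected": "On the table.",  # generated with Mistral-7B-Instruct-v0.2
}
print(sorted(record))  # ['chosen', 'prompt', 'rejected']
```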
The code used to generate the dataset can be found in this repository: https://github.com/DocShotgun/LLM-datagen |
AdapterOcean/gorilla_16k_standardized_cluster_1_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3607783
num_examples: 8302
download_size: 0
dataset_size: 3607783
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gorilla_16k_standardized_cluster_1_std"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hatakeyama-llm-team/japanese2010 | ---
language:
- ja
---
# Japanese Web Corpus 2010 (日本語ウェブコーパス2010)
- This is the data from [here](https://www.s-yata.jp/corpus/nwc2010/), uploaded to Hugging Face.
- Under the fiscal 2009 amendment to Japan's Copyright Act (2009 ordinary Diet session; see "著作権法改正等について" by the Agency for Cultural Affairs, 文化庁), use is permitted only for information-analysis research.
- Sentence-final punctuation (句点) was added automatically using morphological analysis.
- Conversion code
  - [Conversion script](./load_jap.py)
  - [Morphological analysis etc.](./Touten.py) |
wenhanhan/FEVER_test | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 6001705
num_examples: 9999
download_size: 1962743
dataset_size: 6001705
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "FEVER_test"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mj96/subject_lionel_messi | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1136397.0
num_examples: 14
download_size: 1137829
dataset_size: 1136397.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Barry30/lishiqing | ---
license: apache-2.0
---
|
ekazuki/text_to_french_parliament_group_debates | ---
dataset_info:
features:
- name: text
dtype: string
- name: group
dtype: string
splits:
- name: train
num_bytes: 93969142.4
num_examples: 85328
- name: test
num_bytes: 23492285.6
num_examples: 21332
download_size: 65890041
dataset_size: 117461428.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
peterkros/COFOG-feedback | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
|
thanhpn/iapp_wiki_qa_squad_oa | ---
dataset_info:
features:
- name: INSTRUCTION
dtype: string
- name: RESPONSE
dtype: string
- name: SOURCE
dtype: string
splits:
- name: train
num_bytes: 1150840
num_examples: 5761
download_size: 437412
dataset_size: 1150840
---
# Dataset Card for "iapp_wiki_qa_squad_oa"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-jeffdshen__neqa0_8shot-jeffdshen__neqa0_8shot-5a61bc-1852963391 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- jeffdshen/neqa0_8shot
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-125m_eval
metrics: []
dataset_name: jeffdshen/neqa0_8shot
dataset_config: jeffdshen--neqa0_8shot
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-125m_eval
* Dataset: jeffdshen/neqa0_8shot
* Config: jeffdshen--neqa0_8shot
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@jeffdshen](https://huggingface.co/jeffdshen) for evaluating this model. |
open-llm-leaderboard/details_google__gemma-7b | ---
pretty_name: Evaluation run of google/gemma-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [google/gemma-7b](https://huggingface.co/google/gemma-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_google__gemma-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-23T18:01:00.586646](https://huggingface.co/datasets/open-llm-leaderboard/details_google__gemma-7b/blob/main/results_2024-02-23T18-01-00.586646.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6580452433778683,\n\
\ \"acc_stderr\": 0.03198812334565303,\n \"acc_norm\": 0.662225563457007,\n\
\ \"acc_norm_stderr\": 0.03262216078960403,\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.4490548840372056,\n\
\ \"mc2_stderr\": 0.014654652028381131\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5708191126279863,\n \"acc_stderr\": 0.014464085894870653,\n\
\ \"acc_norm\": 0.6109215017064846,\n \"acc_norm_stderr\": 0.014247309976045607\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.622087233618801,\n\
\ \"acc_stderr\": 0.0048387473057833474,\n \"acc_norm\": 0.8247361083449513,\n\
\ \"acc_norm_stderr\": 0.0037941565512722643\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.02854479331905533,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.02854479331905533\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\
\ \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.6994219653179191,\n\
\ \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n\
\ \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.0402873153294756,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.0402873153294756\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5026455026455027,\n \"acc_stderr\": 0.025750949678130387,\n \"\
acc_norm\": 0.5026455026455027,\n \"acc_norm_stderr\": 0.025750949678130387\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8032258064516129,\n \"acc_stderr\": 0.022616409420742025,\n \"\
acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.022616409420742025\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"\
acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8232323232323232,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.8232323232323232,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.42962962962962964,\n \"acc_stderr\": 0.030182099804387262,\n \
\ \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.030182099804387262\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.030588697013783642,\n\
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.030588697013783642\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.0402614149763461,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.0402614149763461\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5787037037037037,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8523206751054853,\n \"acc_stderr\": 0.0230943295825957,\n \
\ \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.0230943295825957\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.030216831011508766,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.030216831011508766\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547129,\n \"\
acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547129\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.019875655027867433,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.019875655027867433\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8378033205619413,\n\
\ \"acc_stderr\": 0.01318222261672089,\n \"acc_norm\": 0.8378033205619413,\n\
\ \"acc_norm_stderr\": 0.01318222261672089\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4033519553072626,\n\
\ \"acc_stderr\": 0.016407123032195253,\n \"acc_norm\": 0.4033519553072626,\n\
\ \"acc_norm_stderr\": 0.016407123032195253\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340866,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340866\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.025403832978179604,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.025403832978179604\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4810951760104302,\n\
\ \"acc_stderr\": 0.012761104871472658,\n \"acc_norm\": 0.4810951760104302,\n\
\ \"acc_norm_stderr\": 0.012761104871472658\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.029289413409403196,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.029289413409403196\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6879084967320261,\n \"acc_stderr\": 0.018745011201277657,\n \
\ \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.018745011201277657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399663,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399663\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.4490548840372056,\n\
\ \"mc2_stderr\": 0.014654652028381131\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7845303867403315,\n \"acc_stderr\": 0.011555295286059282\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5276724791508719,\n \
\ \"acc_stderr\": 0.013751375538801323\n }\n}\n```"
repo_url: https://huggingface.co/google/gemma-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|arc:challenge|25_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|arc:challenge|25_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|gsm8k|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|gsm8k|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hellaswag|10_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hellaswag|10_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T08-54-11.990054.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T18-01-00.586646.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T18-01-00.586646.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- '**/details_harness|winogrande|5_2024-02-16T08-54-11.990054.parquet'
- split: 2024_02_23T18_01_00.586646
path:
- '**/details_harness|winogrande|5_2024-02-23T18-01-00.586646.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-23T18-01-00.586646.parquet'
- config_name: results
data_files:
- split: 2024_02_16T08_54_11.990054
path:
- results_2024-02-16T08-54-11.990054.parquet
- split: 2024_02_23T18_01_00.586646
path:
- results_2024-02-23T18-01-00.586646.parquet
- split: latest
path:
- results_2024-02-23T18-01-00.586646.parquet
---
# Dataset Card for Evaluation run of google/gemma-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [google/gemma-7b](https://huggingface.co/google/gemma-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the results of the most recent run.
An additional configuration, "results", stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load the details of the most recent run for the 5-shot winogrande task.
data = load_dataset(
    "open-llm-leaderboard/details_google__gemma-7b",
    "harness_winogrande_5",
    split="latest",
)
```
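The configuration and split names follow the pattern visible in the metadata above. A minimal sketch of that naming convention (purely illustrative; the helper names are not part of any library, only the pattern is taken from this card):

```python
def details_config_name(task: str, num_fewshot: int) -> str:
    # Per-task detail configs are named "harness_<task>_<num_fewshot>",
    # e.g. "harness_winogrande_5" for the 5-shot winogrande details.
    return f"harness_{task}_{num_fewshot}"


def run_split_name(timestamp: str) -> str:
    # Splits are named after the run timestamp, with the '-' in the date
    # and the ':' in the time replaced by '_',
    # e.g. "2024-02-23T18:01:00.586646" -> "2024_02_23T18_01_00.586646".
    return timestamp.replace("-", "_", 2).replace(":", "_")


print(details_config_name("winogrande", 5))          # harness_winogrande_5
print(run_split_name("2024-02-23T18:01:00.586646"))  # 2024_02_23T18_01_00.586646
```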
## Latest results
These are the [latest results from run 2024-02-23T18:01:00.586646](https://huggingface.co/datasets/open-llm-leaderboard/details_google__gemma-7b/blob/main/results_2024-02-23T18-01-00.586646.json) (note that there might be results for other tasks in the repository if successive evals did not cover the same tasks; you can find each of them in the per-task configurations, under the timestamped splits and the "latest" split):
```python
{
"all": {
"acc": 0.6580452433778683,
"acc_stderr": 0.03198812334565303,
"acc_norm": 0.662225563457007,
"acc_norm_stderr": 0.03262216078960403,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144912,
"mc2": 0.4490548840372056,
"mc2_stderr": 0.014654652028381131
},
"harness|arc:challenge|25": {
"acc": 0.5708191126279863,
"acc_stderr": 0.014464085894870653,
"acc_norm": 0.6109215017064846,
"acc_norm_stderr": 0.014247309976045607
},
"harness|hellaswag|10": {
"acc": 0.622087233618801,
"acc_stderr": 0.0048387473057833474,
"acc_norm": 0.8247361083449513,
"acc_norm_stderr": 0.0037941565512722643
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.02854479331905533,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.02854479331905533
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0349610148119118,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0349610148119118
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.0402873153294756,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.0402873153294756
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5026455026455027,
"acc_stderr": 0.025750949678130387,
"acc_norm": 0.5026455026455027,
"acc_norm_stderr": 0.025750949678130387
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8232323232323232,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.8232323232323232,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.030182099804387262,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.030182099804387262
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.030588697013783642,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.030588697013783642
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.0402614149763461,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.0402614149763461
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.0230943295825957,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.0230943295825957
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508766,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508766
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547129,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547129
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867433,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867433
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8378033205619413,
"acc_stderr": 0.01318222261672089,
"acc_norm": 0.8378033205619413,
"acc_norm_stderr": 0.01318222261672089
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4033519553072626,
"acc_stderr": 0.016407123032195253,
"acc_norm": 0.4033519553072626,
"acc_norm_stderr": 0.016407123032195253
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340866,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340866
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179604,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4810951760104302,
"acc_stderr": 0.012761104871472658,
"acc_norm": 0.4810951760104302,
"acc_norm_stderr": 0.012761104871472658
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.029289413409403196,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.029289413409403196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6879084967320261,
"acc_stderr": 0.018745011201277657,
"acc_norm": 0.6879084967320261,
"acc_norm_stderr": 0.018745011201277657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399663,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399663
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144912,
"mc2": 0.4490548840372056,
"mc2_stderr": 0.014654652028381131
},
"harness|winogrande|5": {
"acc": 0.7845303867403315,
"acc_stderr": 0.011555295286059282
},
"harness|gsm8k|5": {
"acc": 0.5276724791508719,
"acc_stderr": 0.013751375538801323
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Sajjo/bangala_data_v2 | ---
dataset_info:
features:
- name: path
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 208149
num_examples: 1034
download_size: 83974
dataset_size: 208149
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/med_alpaca_standardized_cluster_35_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 17717791
num_examples: 11770
download_size: 9015332
dataset_size: 17717791
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_35_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
argilla/multi-modal | ---
dataset_info:
features:
- name: content
dtype: string
id: field
- name: description
list:
- name: user_id
dtype: string
id: question
- name: value
dtype: string
id: suggestion
- name: status
dtype: string
id: question
- name: description-suggestion
dtype: string
id: suggestion
- name: description-suggestion-metadata
struct:
- name: type
dtype: string
id: suggestion-metadata
- name: score
dtype: float32
id: suggestion-metadata
- name: agent
dtype: string
id: suggestion-metadata
- name: quality
list:
- name: user_id
dtype: string
id: question
- name: value
dtype: int32
id: suggestion
- name: status
dtype: string
id: question
- name: quality-suggestion
dtype: int32
id: suggestion
- name: quality-suggestion-metadata
struct:
- name: type
dtype: string
id: suggestion-metadata
- name: score
dtype: float32
id: suggestion-metadata
- name: agent
dtype: string
id: suggestion-metadata
- name: age_group
list:
- name: user_id
dtype: string
id: question
- name: value
dtype: string
id: suggestion
- name: status
dtype: string
id: question
- name: age_group-suggestion
dtype: string
id: suggestion
- name: age_group-suggestion-metadata
struct:
- name: type
dtype: string
id: suggestion-metadata
- name: score
dtype: float32
id: suggestion-metadata
- name: agent
dtype: string
id: suggestion-metadata
- name: external_id
dtype: string
id: external_id
- name: metadata
dtype: string
id: metadata
splits:
- name: train
num_bytes: 76240752
num_examples: 60
download_size: 0
dataset_size: 76240752
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "multi-modal"
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla) or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
Argilla supports Markdown within its text fields. This means you can easily add formatting like **bold** and *italic* text, [links](https://www.google.com), and even insert HTML elements like images, audio, videos, and iframes.
A multi-modal dataset combines text with different types of media content, and can be useful for tasks such as image captioning, video captioning, and audio captioning.
This multi-modal dataset example uses three different datasets from Hugging Face:
* **Video**: We use an action recognition dataset, the [ucf101-subset](https://huggingface.co/datasets/sayakpaul/ucf101-subset) from [UCF101](https://www.crcv.ucf.edu/data/UCF101.php). This dataset contains realistic action videos from YouTube, classified into 101 action categories.
* **Audio**: We use an audio classification dataset, [ccmusic-database/bel_folk](https://huggingface.co/datasets/ccmusic-database/bel_folk). This dataset contains one-minute audio clips of Chinese folk music, labeled with the genre of the music.
* **Image**: We use an image classification dataset, the [zishuod/pokemon-icons](https://huggingface.co/datasets/zishuod/pokemon-icons). This dataset contains images of Pokemon that need to be classified.
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("argilla/multi-modal")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("argilla/multi-modal")
```
### Supported Tasks
- Multi-modal classification
- Multi-modal transcription
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| text | Text | text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| label | Label | label_selection | True | N/A | ['World', 'Sports', 'Business', 'Sci/Tech'] |
The **suggestions** are human- or machine-generated recommendations for each question, intended to assist the annotator during the annotation process. Each suggestion is linked to an existing question and named by appending "-suggestion" and "-suggestion-metadata" to the question name; these columns contain the suggested value(s) and the suggestion's metadata, respectively. The possible values are the same as in the table above.
**✨ NEW** The **metadata** is a dictionary that can be used to provide additional information about the dataset record, giving annotators extra context — for example, a link to the original source, or the author, date, or provenance of the record. The metadata is always optional, and can be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
The **guidelines** are optional as well: a plain string used to provide instructions to the annotators. Find them in the [annotation guidelines](#annotation-guidelines) section.
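As a sketch of the suggestion naming convention described above, a single record for a question named `label` could look like the following (the values are illustrative assumptions, not taken from this dataset):

```python
# Illustrative record layout for a question named "label".
# Suggestions live in "<question>-suggestion" and
# "<question>-suggestion-metadata" columns, holding the suggested
# value and its metadata (type, score, agent), respectively.
record = {
    "text": "Some content to annotate",
    "label": [{"user_id": "annotator-1", "value": "Sports", "status": "submitted"}],
    "label-suggestion": "Sports",
    "label-suggestion-metadata": {"type": "model", "score": 0.92, "agent": "baseline-v1"},
}

# The suggestion column names can be derived mechanically from the question name:
question = "label"
assert f"{question}-suggestion" in record
assert f"{question}-suggestion-metadata" in record
```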
#### Data in "multi-modal" Dataset
* **Fields:** These are the records, each of them is a video, audio or image file encoded in base64.
* **text** is of type `text`.
* **Questions:** These are the questions that should be annotated.
* **TextQuestion** asks annotators to describe the content in detail.
* **RatingQuestion** lets annotators rate the content's quality.
* **LabelQuestion** tags the content with the most suitable age group.
* **Metadata:** Three metadata properties are added to streamline content management.
* **groups** is to identify the assigned annotator group.
* **media** will specify the media source.
* **source-dataset** will highlight the source dataset of the content in each record.
### Data Splits
The dataset contains a single split, which is `train`. |
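Because the `metadata` column is stored as a JSON string, records can be grouped or filtered by media type after parsing it. A minimal local sketch (the toy records below are assumptions that mirror the metadata properties described above):

```python
import json

# Toy records mimicking the dataset's schema: `metadata` is a JSON string
# carrying the "media" and "source-dataset" properties described above.
records = [
    {"text": "<audio ...>", "metadata": json.dumps(
        {"media": "audio", "source-dataset": "ccmusic-database/bel_folk"})},
    {"text": "<img ...>", "metadata": json.dumps(
        {"media": "image", "source-dataset": "zishuod/pokemon-icons"})},
]

# Keep only the audio records by parsing each metadata string.
audio_only = [r for r in records
              if json.loads(r["metadata"]).get("media") == "audio"]
```

The same predicate could be passed to `datasets.Dataset.filter` once the dataset is loaded.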
liuyanchen1015/MULTI_VALUE_wnli_double_comparative | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 526
num_examples: 3
- name: test
num_bytes: 2144
num_examples: 9
- name: train
num_bytes: 6074
num_examples: 33
download_size: 11799
dataset_size: 8744
---
# Dataset Card for "MULTI_VALUE_wnli_double_comparative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
edumunozsala/Bactrian-X-es | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 52741859
num_examples: 67017
download_size: 31116069
dataset_size: 52741859
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kjappelbaum/chemnlp-chemdner | ---
dataset_info:
features:
- name: entities
sequence: string
- name: text
dtype: string
- name: split
dtype: string
splits:
- name: train
num_bytes: 14376666
num_examples: 19440
download_size: 8033115
dataset_size: 14376666
---
# Dataset Card for "chemnlp-chemdner"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NaturalStupidlty/FinBERT-Twitter-BTC | ---
license: apache-2.0
---
|
HydraIndicLM/hindi_alpaca_dolly_67k | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: id
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 212592103
num_examples: 67017
download_size: 80604522
dataset_size: 212592103
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
## About
This repo contains a 67K instruction set for Hindi, translated from Alpaca and Dolly.
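The features include a `text` column alongside `instruction`, `input`, and `output`. A minimal sketch of how such a column is typically assembled with an Alpaca-style template (the exact template used for this repo is an assumption):

```python
def build_prompt(instruction: str, input_text: str, output: str) -> str:
    """Assemble an Alpaca-style training example.

    The template below is an assumed, commonly used format,
    not taken from this repository.
    """
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an input.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            f"### Response:\n{output}"
        )
    # Instructions without an input drop the "### Input:" section entirely.
    return (
        "Below is an instruction that describes a task.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        f"### Response:\n{output}"
    )
```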
## Citation
If you find this repository useful, please consider giving 👏 and citing:
```
@misc{HindiAlpacaDolly,
author = {Sambit Sekhar and Shantipriya Parida},
title = {Hindi Instruction Set Based on Alpaca and Dolly},
year = {2023},
publisher = {Hugging Face},
journal = {Hugging Face repository},
howpublished = {\url{https://huggingface.co/OdiaGenAI}},
}
```
|
5CD-AI/Vietnamese-ComplexWebQuestions-gg-translated | ---
task_categories:
- question-answering
language:
- en
- vi
size_categories:
- 10K<n<100K
--- |
hanyarammah/hhhhh | ---
license: unknown
---
|
update0909/ApolloAuto-zyx-apollo | ---
dataset_info:
features:
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 152781948
num_examples: 17194
download_size: 46566010
dataset_size: 152781948
---
# Dataset Card for "ApolloAuto-zyx-apollo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sabbyanandan/fooz | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_4season__alignment-model-test10 | ---
pretty_name: Evaluation run of 4season/alignment-model-test10
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [4season/alignment-model-test10](https://huggingface.co/4season/alignment-model-test10)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_4season__alignment-model-test10\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T08:12:37.264622](https://huggingface.co/datasets/open-llm-leaderboard/details_4season__alignment-model-test10/blob/main/results_2024-04-09T08-12-37.264622.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6830251707414831,\n\
\ \"acc_stderr\": 0.03150837150158549,\n \"acc_norm\": 0.6842989605978566,\n\
\ \"acc_norm_stderr\": 0.032158515186000075,\n \"mc1\": 0.5691554467564259,\n\
\ \"mc1_stderr\": 0.01733527247533237,\n \"mc2\": 0.710829343782068,\n\
\ \"mc2_stderr\": 0.014802276642222825\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7738907849829352,\n \"acc_stderr\": 0.012224202097063276,\n\
\ \"acc_norm\": 0.7960750853242321,\n \"acc_norm_stderr\": 0.01177426247870226\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7762397928699463,\n\
\ \"acc_stderr\": 0.004159114679873824,\n \"acc_norm\": 0.9001194981079467,\n\
\ \"acc_norm_stderr\": 0.002992278134932447\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.033911609343436025,\n\
\ \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.033911609343436025\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724067,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.03216600808802269,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.03216600808802269\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n\
\ \"acc_stderr\": 0.034765996075164785,\n \"acc_norm\": 0.7052023121387283,\n\
\ \"acc_norm_stderr\": 0.034765996075164785\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6425531914893617,\n \"acc_stderr\": 0.031329417894764254,\n\
\ \"acc_norm\": 0.6425531914893617,\n \"acc_norm_stderr\": 0.031329417894764254\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5526315789473685,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.5526315789473685,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n\
\ \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5105820105820106,\n \"acc_stderr\": 0.02574554227604548,\n \"\
acc_norm\": 0.5105820105820106,\n \"acc_norm_stderr\": 0.02574554227604548\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8258064516129032,\n\
\ \"acc_stderr\": 0.021576248184514583,\n \"acc_norm\": 0.8258064516129032,\n\
\ \"acc_norm_stderr\": 0.021576248184514583\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6157635467980296,\n \"acc_stderr\": 0.0342239856565755,\n\
\ \"acc_norm\": 0.6157635467980296,\n \"acc_norm_stderr\": 0.0342239856565755\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.030117688929503564,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.030117688929503564\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603915,\n \"\
acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593542,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593542\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.023400928918310485,\n\
\ \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.023400928918310485\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.02904560029061626,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.02904560029061626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.02720537153827948,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.02720537153827948\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.48344370860927155,\n \"acc_stderr\": 0.0408024418562897,\n \"\
acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.0408024418562897\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8715596330275229,\n \"acc_stderr\": 0.014344977542914318,\n \"\
acc_norm\": 0.8715596330275229,\n \"acc_norm_stderr\": 0.014344977542914318\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568627,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568627\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7623318385650224,\n\
\ \"acc_stderr\": 0.028568079464714274,\n \"acc_norm\": 0.7623318385650224,\n\
\ \"acc_norm_stderr\": 0.028568079464714274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026622,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026622\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4558659217877095,\n\
\ \"acc_stderr\": 0.016657229424586306,\n \"acc_norm\": 0.4558659217877095,\n\
\ \"acc_norm_stderr\": 0.016657229424586306\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02380518652488814,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02380518652488814\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n\
\ \"acc_stderr\": 0.0247238615047717,\n \"acc_norm\": 0.7459807073954984,\n\
\ \"acc_norm_stderr\": 0.0247238615047717\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023132376234543346,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023132376234543346\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5354609929078015,\n \"acc_stderr\": 0.02975238965742705,\n \
\ \"acc_norm\": 0.5354609929078015,\n \"acc_norm_stderr\": 0.02975238965742705\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4973924380704042,\n\
\ \"acc_stderr\": 0.012770062445433172,\n \"acc_norm\": 0.4973924380704042,\n\
\ \"acc_norm_stderr\": 0.012770062445433172\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.02757646862274053,\n\
\ \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.02757646862274053\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6862745098039216,\n \"acc_stderr\": 0.01877168389352817,\n \
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.01877168389352817\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.02721283588407316,\n\
\ \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.02721283588407316\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5691554467564259,\n\
\ \"mc1_stderr\": 0.01733527247533237,\n \"mc2\": 0.710829343782068,\n\
\ \"mc2_stderr\": 0.014802276642222825\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8721389108129439,\n \"acc_stderr\": 0.009385235583937262\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5648218347232752,\n \
\ \"acc_stderr\": 0.013656253875470738\n }\n}\n```"
repo_url: https://huggingface.co/4season/alignment-model-test10
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|arc:challenge|25_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|arc:challenge|25_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|gsm8k|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|gsm8k|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hellaswag|10_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hellaswag|10_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|winogrande|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|winogrande|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T08-12-37.264622.parquet'
- config_name: results
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- results_2024-04-09T08-12-09.669210.parquet
- split: 2024_04_09T08_12_37.264622
path:
- results_2024-04-09T08-12-37.264622.parquet
- split: latest
path:
- results_2024-04-09T08-12-37.264622.parquet
---
# Dataset Card for Evaluation run of 4season/alignment-model-test10
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [4season/alignment-model-test10](https://huggingface.co/4season/alignment-model-test10) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_4season__alignment-model-test10",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-09T08:12:37.264622](https://huggingface.co/datasets/open-llm-leaderboard/details_4season__alignment-model-test10/blob/main/results_2024-04-09T08-12-37.264622.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6830251707414831,
"acc_stderr": 0.03150837150158549,
"acc_norm": 0.6842989605978566,
"acc_norm_stderr": 0.032158515186000075,
"mc1": 0.5691554467564259,
"mc1_stderr": 0.01733527247533237,
"mc2": 0.710829343782068,
"mc2_stderr": 0.014802276642222825
},
"harness|arc:challenge|25": {
"acc": 0.7738907849829352,
"acc_stderr": 0.012224202097063276,
"acc_norm": 0.7960750853242321,
"acc_norm_stderr": 0.01177426247870226
},
"harness|hellaswag|10": {
"acc": 0.7762397928699463,
"acc_stderr": 0.004159114679873824,
"acc_norm": 0.9001194981079467,
"acc_norm_stderr": 0.002992278134932447
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.033911609343436025,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.033911609343436025
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724067,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.03216600808802269,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.03216600808802269
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.034765996075164785,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.034765996075164785
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6425531914893617,
"acc_stderr": 0.031329417894764254,
"acc_norm": 0.6425531914893617,
"acc_norm_stderr": 0.031329417894764254
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5105820105820106,
"acc_stderr": 0.02574554227604548,
"acc_norm": 0.5105820105820106,
"acc_norm_stderr": 0.02574554227604548
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.021576248184514583,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.021576248184514583
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6157635467980296,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.6157635467980296,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.030117688929503564,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.030117688929503564
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.025190921114603915,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.025190921114603915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593542,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.023400928918310485,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.023400928918310485
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.02904560029061626,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.02904560029061626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.02720537153827948,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.02720537153827948
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.0408024418562897,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.0408024418562897
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8715596330275229,
"acc_stderr": 0.014344977542914318,
"acc_norm": 0.8715596330275229,
"acc_norm_stderr": 0.014344977542914318
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568627,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568627
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7623318385650224,
"acc_stderr": 0.028568079464714274,
"acc_norm": 0.7623318385650224,
"acc_norm_stderr": 0.028568079464714274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026622,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026622
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4558659217877095,
"acc_stderr": 0.016657229424586306,
"acc_norm": 0.4558659217877095,
"acc_norm_stderr": 0.016657229424586306
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02380518652488814,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02380518652488814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.0247238615047717,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.0247238615047717
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.023132376234543346,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.023132376234543346
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5354609929078015,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.5354609929078015,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4973924380704042,
"acc_stderr": 0.012770062445433172,
"acc_norm": 0.4973924380704042,
"acc_norm_stderr": 0.012770062445433172
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.02757646862274053,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.02757646862274053
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.01877168389352817,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.01877168389352817
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.02721283588407316,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.02721283588407316
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5691554467564259,
"mc1_stderr": 0.01733527247533237,
"mc2": 0.710829343782068,
"mc2_stderr": 0.014802276642222825
},
"harness|winogrande|5": {
"acc": 0.8721389108129439,
"acc_stderr": 0.009385235583937262
},
"harness|gsm8k|5": {
"acc": 0.5648218347232752,
"acc_stderr": 0.013656253875470738
}
}
```
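The aggregated `"all"` entry above summarizes the per-task metrics. As an illustrative sketch (not part of the official leaderboard tooling), here is how a mean accuracy over the MMLU (`hendrycksTest`) tasks could be computed from a results dict of this shape; the excerpt below copies only a few of the entries above:

```python
# Toy excerpt of the results dict shown above; only a few tasks are included.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.39},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6074074074074074},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7763157894736842},
    "harness|truthfulqa:mc|0": {"mc1": 0.5691554467564259},  # not an MMLU task
}

# Average accuracy over MMLU tasks only, identified by their name prefix.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU tasks: {len(mmlu_accs)}, mean acc: {mmlu_mean:.4f}")
```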
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
clonandovoz/clonandovoz | ---
license: openrail
---
|
EleutherAI/muInstruct | ---
license: apache-2.0
task_categories:
- text2text-generation
language:
- en
tags:
- math
size_categories:
- 1K<n<10K
---
**μInstruct** is a dataset of 1600 instruction-response pairs collected from highly-rated Stack Exchange answers, the Khan Academy subset of [AMPS](https://github.com/hendrycks/math), and the [MATH](https://huggingface.co/datasets/hendrycks/competition_math) training set. All training examples are valid Markdown and have been manually reviewed by a human for quality.
The μInstruct dataset is most useful when mixed in with larger instruction or chat datasets, such as [OpenHermes](https://huggingface.co/datasets/teknium/OpenHermes-2.5). Because μInstruct is especially high-quality, you may consider oversampling it in your training mixture.
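One simple way to oversample a small dataset in a mixture is to repeat it before shuffling. The sketch below is purely illustrative (the record contents and the 3x repetition factor are hypothetical choices, not values recommended by the dataset authors):

```python
import random

# Stand-ins for the real datasets: 1,600 μInstruct examples and a much
# larger instruction dataset (e.g. OpenHermes-sized).
mu_instruct = [{"source": "muinstruct", "id": i} for i in range(1_600)]
large_mix = [{"source": "large_mix", "id": i} for i in range(100_000)]

OVERSAMPLE = 3  # hypothetical factor: include μInstruct 3 times in the pool
pool = mu_instruct * OVERSAMPLE + large_mix

rng = random.Random(0)
rng.shuffle(pool)  # interleave the repeated small dataset with the large one

mu_fraction = sum(ex["source"] == "muinstruct" for ex in pool) / len(pool)
print(f"Oversampled μInstruct share of the pool: {mu_fraction:.2%}")
```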
μInstruct was used to train [`llemma_7b_muinstruct_camelmath`](https://huggingface.co/EleutherAI/llemma_7b_muinstruct_camelmath). |
pphuc25/VLSP_T1 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 870843590.0
num_examples: 7500
download_size: 862653100
dataset_size: 870843590.0
---
# Dataset Card for "VLSP_T1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rish16/cs4243-database-dict | ---
license: mit
---
|
wckwan/M4LE | ---
license: mit
task_categories:
- question-answering
- translation
- summarization
- text-classification
- text-retrieval
language:
- en
- zh
tags:
- Long Context
size_categories:
- 1K<n<10K
---
## Introduction
**M4LE** is a **M**ulti-ability, **M**ulti-range, **M**ulti-task, bilingual benchmark for long-context evaluation. We categorize long-context understanding into five distinct abilities by considering whether it is required to identify single or multiple spans in long contexts based on explicit or semantic hints. Specifically, these abilities are explicit single-span, semantic single-span, explicit multiple-span, semantic multiple-span, and global. Different from previous long-context benchmarks that are simply compiled from a set of existing long NLP benchmarks, we introduce an automated method to transform short-sequence tasks into a comprehensive long-sequence scenario encompassing all these capabilities.
M4LE consists of 36 tasks, covering 11 task types and 12 domains. For each task, we construct 200 instances for each context length bucket (1K, 2K, 4K, 6K, 8K, 12K, 16K, 24K, 32K). Due to computation and cost constraints, our paper evaluated 11 well-established LLMs on instances up to the 8K context length bucket. For more details, please refer to the paper available at <https://arxiv.org/abs/2310.19240>. You can also explore the GitHub page at <https://github.com/KwanWaiChung/M4LE>.
## Usage
You can load the dataset by specifying the task name:
```python
from datasets import load_dataset
tasks = [
"arxiv",
"bigpatent_global_cls",
"bigpatent_global_sum",
"booksum",
"c3",
"cepsum",
"clts+",
"cnewsum",
"cnnnews",
"drcd_explicit-single",
"drcd_semantic-single",
"duorc",
"dureader",
"hotpotqa",
"lcsts",
"marc",
"mnds-news_explicit-single",
"mnds-news_explicit-multiple",
"mnds-news_semantic-multiple",
"ncls",
"news-commentary-en2zh",
"news-commentary-zh2en",
"news2016",
"newsqa",
"nq-open",
"online-shopping",
"open-subtitles-en2zh",
"open-subtitles-zh2en",
"pubmed",
"tedtalks-en2zh",
"tedtalks-zh2en",
"thucnews_explicit-single",
"thucnews_explicit-multiple",
"thucnews_semantic-multiple",
"triviaqa",
"wiki2019zh",
"wikihow",
"wikitext-103",
"wow",
]
for task in tasks:
data = load_dataset('wckwan/M4LE', task, split='test')
```
## Format
Each testing instance follows this format:
```yaml
{
"instruction": "<task description>",
"input": "<task input with one-shot example>",
"answers": ["<answer1>", "<answer2>"],
"input_length": <int, number of words in instruction and input separated by space>,
"total_length": <int, number of words in instruction, input and gold answer separated by space>,
"length_bucket": <int, the length bucket to which this instance belongs>
}
```
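The two length fields follow directly from the field descriptions above (whitespace-separated word counts). As a minimal sketch with a made-up toy instance, they could be recomputed like this:

```python
# Toy instance; the content is illustrative, not taken from M4LE itself.
instance = {
    "instruction": "Summarize the specified news article.",
    "input": "Article 1: Lorem ipsum dolor sit amet.",
    "answers": ["Lorem ipsum summary."],
}

# input_length: words in instruction and input, separated by space.
input_length = len(f'{instance["instruction"]} {instance["input"]}'.split())
# total_length: input_length plus the words in the gold answer.
total_length = input_length + len(instance["answers"][0].split())
print(input_length, total_length)
```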
## Tasks
Here is the full list of tasks with their descriptions. For more details about these tasks, please refer to the paper.
Ability | Task Name | Task Type | Language | Description
----------------- | ------------------------------------------- | ---------- | -------- | ------------------------------------------------------------------
Explicit Single | mnds-news_explicit-single | CLS + RET | En | Classify a specified news article.
Explicit Single | thucnews_explicit-single | CLS + RET | Zh | Classify a specified news article.
Explicit Single | newsqa | QA + RET | En | Answer a question based on a specified news article.
Explicit Single | c3 | QA + RET | Zh | Answer a multi-choice question based on a textbook extract.
Explicit Single | wow | RET | En | Return the ID of the article related to a specified topic.
Explicit Single | drcd_explicit-single | RET | Zh | Return the ID of the article related to a specified topic.
Explicit Single | cnnnews | SUM + RET | En | Summarize a specified news article.
Explicit Single | cepsum | SUM + RET | Zh | Summarize a specified product description.
Explicit Single | lcsts | SUM + RET | Zh | Summarize a specified news article.
Explicit Single | ncls | SUM + RET | En, Zh | Summarize a specified news article.
Explicit Multiple | mnds-news_explicit-multiple                 | CLS + RET  | En       | Return the IDs of all the articles belonging to a specified class.
Explicit Multiple | thucnews_explicit-multiple                  | CLS + RET  | Zh       | Return the IDs of all the articles belonging to a specified class.
Explicit Multiple | marc | CLS + RET | En, Zh | Return the IDs of all the positive product reviews.
Explicit Multiple | online-shopping | CLS + RET | Zh | Return the IDs of all the positive product reviews.
Semantic Single | wikitext-103 | NLI + RET | En | Return the ID of the paragraph that continues a query paragraph.
Semantic Single | wiki2019zh | NLI + RET | Zh | Return the ID of the paragraph that continues a query paragraph.
Semantic Single | duorc | QA | En | Answer a question based on multiple movie plots.
Semantic Single | nq-open | QA | En | Answer a question based on multiple wikipedia paragraphs.
Semantic Single | dureader | QA | Zh | Answer a question based on multiple web snippets.
Semantic Single | drcd_semantic-single | QA | Zh | Answer a question based on multiple wikipedia paragraphs.
Semantic Single | wikihow | SUM + RET | En | Summarize an article based on a given topic.
Semantic Single | news2016 | SUM + RET | Zh | Summarize a news article based on a given title.
Semantic Single | tedtalks-en2zh/tedtalks-zh2en | TRAN + RET | En, Zh | Translate a Ted Talk transcript based on a given title.
Semantic Multiple | mnds-news_semantic-multiple | CLS + CNT | En | Return the number of news articles belonging to a specified class.
Semantic Multiple | thucnews_semantic-multiple | CLS + CNT | Zh | Return the number of news articles belonging to a specified class.
Semantic Multiple | hotpotqa | QA | En | Answer a question based on multiple wikipedia paragraphs.
Global | bigpatent_global_cls | CLS | En | Classify a patent document.
Global | triviaqa | QA | En | Answer a question based on a web snippet.
Global | arxiv | SUM | En | Summarize an academic paper.
Global | bigpatent_global_sum | SUM | En | Summarize a patent document.
Global | pubmed | SUM | En | Summarize a medical paper.
Global | booksum | SUM | En | Summarize one or more chapters of a book.
Global | cnewsum | SUM | Zh | Summarize a news article.
Global | clts+ | SUM | Zh | Summarize a news article.
Global | open-subtitles-en2zh/open-subtitles-zh2en | TRAN | En, Zh | Translate the movie subtitles.
Global            | news-commentary-en2zh/news-commentary-zh2en | TRAN       | En, Zh   | Translate the news commentary articles.
## Citation
If you find our paper and resources useful, please consider citing our paper:
```bibtex
@misc{kwan_m4le_2023,
title = {{{M4LE}}: {{A Multi-Ability Multi-Range Multi-Task Multi-Domain Long-Context Evaluation Benchmark}} for {{Large Language Models}}},
author = {Kwan, Wai-Chung and Zeng, Xingshan and Wang, Yufei and Sun, Yusen and Li, Liangyou and Shang, Lifeng and Liu, Qun and Wong, Kam-Fai},
year = {2023},
}
```
|
songlab/gpn-msa-sapiens-dataset | ---
license: mit
tags:
- dna
- biology
- genomics
---
# Training windows for GPN-MSA-Sapiens
For more information check out our [paper](https://doi.org/10.1101/2023.10.10.561776) and [repository](https://github.com/songlab-cal/gpn).
Path in Snakemake:
`results/dataset/multiz100way/89/128/64/True/defined.phastCons.percentile-75_0.05_0.001` |
mstz/breast | ---
language:
- en
tags:
- breast
- tabular_classification
- binary_classification
- UCI
pretty_name: Breast
size_categories:
- n<1K
task_categories:
- tabular-classification
configs:
- cancer
license: cc
---
# Breast cancer
The [Breast cancer dataset](https://archive.ics.uci.edu/ml/datasets/Breast+Cancer+Wisconsin+%28Original%29) from the [UCI ML repository](https://archive.ics.uci.edu/ml/datasets).
Classify whether a given cell clump is cancerous.
# Configurations and tasks
| **Configuration** | **Task** | Description |
|-------------------|---------------------------|---------------------------------------------------------------|
| cancer | Binary classification | Is the cell clump cancerous? |
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/breast", "cancer")["train"]
```
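Rows load as flat dicts of integer features; a minimal pure-Python sketch (column names assumed from the feature table below) of turning them into `(X, y)` pairs for a downstream classifier:

```python
# Column order assumed from the feature table in this card; adjust if
# the actual dataset schema differs.
FEATURES = [
    "clump_thickness", "uniformity_of_cell_size", "uniformity_of_cell_shape",
    "marginal_adhesion", "single_epithelial_cell_size", "bare_nuclei",
    "bland_chromatin", "normal_nucleoli", "mitoses",
]

def to_xy(rows):
    """Convert an iterable of row dicts into (X, y) lists."""
    X = [[row[f] for f in FEATURES] for row in rows]
    y = [row["is_cancer"] for row in rows]
    return X, y

# Toy rows mirroring the schema:
rows = [
    {**{f: 1 for f in FEATURES}, "is_cancer": 0},
    {**{f: 8 for f in FEATURES}, "is_cancer": 1},
]
X, y = to_xy(rows)
print(len(X[0]), y)  # 9 [0, 1]
```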
# Features
| **Name** |**Type**|**Description** |
|-------------------------------|--------|----------------------------|
|`clump_thickness` |`int8` |Thickness of the clump |
|`uniformity_of_cell_size` |`int8` |Uniformity of cell size |
|`uniformity_of_cell_shape` |`int8` |Uniformity of cell shape |
|`marginal_adhesion` |`int8` |Marginal adhesion |
|`single_epithelial_cell_size` |`int8` |Single epithelial cell size |
|`bare_nuclei` |`int8` |Bare nuclei |
|`bland_chromatin` |`int8` |Bland chromatin |
|`normal_nucleoli` |`int8` |Normal nucleoli |
|`mitoses` |`int8` |Mitoses |
|**is_cancer** |`int8` |Is the clump cancerous |
joey234/mmlu-high_school_microeconomics-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 6787
num_examples: 5
- name: test
num_bytes: 1900343
num_examples: 238
download_size: 209220
dataset_size: 1907130
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-high_school_microeconomics-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
riogerz/florz | ---
license: openrail
---
|
polymath707/aseanllama2-without-emojis | ---
license: apache-2.0
---
|
Vecinito87/SD_IMG_POOL | ---
license: unknown
---
|
dustalov/pierogue | ---
annotations_creators:
- machine-generated
language:
- en
language_creators:
- machine-generated
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Pierogue
size_categories:
- n<1K
source_datasets:
- original
tags:
- cosmos
- nature
- music
- technology
- fashion
- education
- qrels
- queries
- documents
task_categories:
- text-retrieval
- feature-extraction
- text-generation
task_ids:
- document-retrieval
- language-modeling
dataset_info:
- config_name: documents
features:
- name: document_id
dtype: int8
- name: topic
dtype:
class_label:
names:
'0': cosmos
'1': nature
'2': music
'3': technology
'4': fashion
- name: text
dtype: string
splits:
- name: train
num_bytes: 8125
num_examples: 10
- name: test
num_bytes: 6743
num_examples: 5
- config_name: queries
features:
- name: query_id
dtype: int8
- name: topic
dtype:
class_label:
names:
'0': cosmos
'1': nature
'2': music
'3': technology
'4': fashion
- name: query
dtype: string
splits:
- name: train
num_bytes: 2728
num_examples: 25
- name: test
num_bytes: 2280
num_examples: 10
- config_name: qrels
features:
- name: query_id
dtype: int8
- name: document_id
dtype: int8
- name: relevancy
dtype: int8
splits:
- name: train
num_bytes: 2109
num_examples: 375
- name: test
num_bytes: 1951
num_examples: 150
- config_name: embeddings
features:
- name: word
dtype: string
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 300741
num_examples: 566
- config_name: relatedness
features:
- name: word1
dtype: string
- name: word2
dtype: string
- name: score
dtype: float64
- name: rank
dtype: int16
splits:
- name: train
num_bytes: 6522
num_examples: 100
- name: test
num_bytes: 6294
num_examples: 100
- config_name: analogies
features:
- name: a
dtype: string
- name: c
dtype: string
- name: b
dtype: string
- name: d
dtype: string
splits:
- name: train
num_bytes: 3598
num_examples: 8
configs:
- config_name: documents
data_files:
- split: train
path: documents/train*.parquet
- split: test
path: documents/test*.parquet
default: true
- config_name: queries
data_files:
- split: train
path: queries/train*.parquet
- split: test
path: queries/test*.parquet
- config_name: qrels
data_files:
- split: train
path: qrels/train*.parquet
- split: test
path: qrels/test*.parquet
- config_name: embeddings
data_files: embeddings.parquet
- config_name: relatedness
data_files:
- split: train
path: relatedness/train*.parquet
- split: test
path: relatedness/test*.parquet
- config_name: analogies
data_files: analogies.parquet
---
# Pierogue
**Pierogue** is a small open-licensed machine-generated dataset that contains fifteen short texts in English covering five topics, provided with the relevance judgements (qrels), designed for educational purposes.
- Topics: cosmos, nature, music, technology, fashion
- Splits: `train` (10 documents, 375 qrels) and `test` (5 documents, 150 qrels)
Texts were generated by ChatGPT 3.5. Queries, qrels, and analogies were generated by GPT-4. Words were provided with Word2Vec embeddings based on the Google News dataset.
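The `embeddings` config pairs each word with a dense float vector, and scores like those in the `relatedness` config are typically compared against cosine similarity between such vectors; a minimal pure-Python sketch (illustrative only):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

print(round(cosine([1.0, 0.0], [1.0, 1.0]), 3))  # 0.707
```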

|
TerminatorJ/RNA_chemical_ribonanza | ---
license: mit
---
|
YUiCHl/building_scale | ---
dataset_info:
features:
- name: image_paths
dtype: string
- name: conditioning_paths
dtype: string
- name: captions
dtype: string
splits:
- name: train
num_bytes: 2579586
num_examples: 12474
download_size: 223471
dataset_size: 2579586
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Rewcifer/radio-llama2-5pct-filtered | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5401871
num_examples: 1000
download_size: 1248779
dataset_size: 5401871
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "radio-llama2-5pct-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Resteves/ServiciosPublicos | ---
license: apache-2.0
---
|
AmelieSchreiber/cafa_5_train_val_split_1 | ---
license: mit
---
|
shi3z/ja_conv_wikipedia_llama2pro8b_30k | ---
license: llama2
task_categories:
- conversational
language:
- ja
size_categories:
- 10K<n<100K
---
This dataset is based on the Japanese Wikipedia dataset and was converted into a multi-turn conversation format using LLaMA-Pro-8B.
Since it is distributed under the Llama 2 license, it can be used commercially in services.
Some strange dialogue may be included, as it has not been screened by humans.
We generated over 80,000 conversations over 22 days on a machine with 7x A100 80GB GPUs, then automatically screened them.
# Model
https://huggingface.co/spaces/TencentARC/LLaMA-Pro-8B-Instruct-Chat
# Dataset
https://huggingface.co/datasets/izumi-lab/wikipedia-ja-20230720
# Compute by
Tsuginosuke AI SuperComputer
FreeAI Ltd.
https://free-ai.ltd |
rocca/top-reddit-posts | ---
license: mit
---
The `post-data-by-subreddit.tar` file contains 5000 gzipped json files - one for each of the top 5000 subreddits (as roughly measured by subscriber count and comment activity). Each of those json files (e.g. `askreddit.json`) contains an array of the data for the top 1000 posts of all time.
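A minimal sketch (assuming the archive layout described above: one gzipped JSON array per subreddit) of reading a single subreddit's posts straight from the tar, without unpacking everything:

```python
import gzip
import json
import tarfile

def load_subreddit(tar_path: str, member_name: str):
    """Read one gzipped JSON member from the archive.

    The member naming is an assumption about the archive layout; list
    tarfile.open(tar_path).getnames() to see the real member names.
    """
    with tarfile.open(tar_path) as tar:
        member = tar.extractfile(member_name)
        return json.loads(gzip.decompress(member.read()))
```

For example, `load_subreddit('post-data-by-subreddit.tar', 'askreddit.json')` (the exact member name depends on how the archive was packed).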
Notes:
* I stopped crawling a subreddit's top-posts list if I reached a batch that had a post with a score less than 5, so some subreddits won't have the full 1000 posts.
* No post comments are included, only the posts themselves.
* See the example file `askreddit.json` in this repo if you want to see what you're getting before downloading all the data.
* The list of subreddits included are listed in `top-5k-subreddits.json`.
* NSFW subreddits have been included in the crawl, so you might have to filter them out depending on your use case.
* The Deno scraping/crawling script is included as `crawl.js`, and can be started with `deno run --allow-net --allow-read=. --allow-write=. crawl.js` once you've [installed Deno](https://deno.land/manual/getting_started/installation) and have downloaded `top-5k-subreddits.json` into the same folder as `crawl.js`. |
Duc2k1nh191468/DATN_2024_Train | ---
license: apache-2.0
dataset_info:
features:
- name: STT
dtype: int64
- name: Name
dtype: string
- name: Audio
dtype: audio
- name: Text
dtype: string
splits:
- name: train
num_bytes: 26906435.0
num_examples: 161
download_size: 21936915
dataset_size: 26906435.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
spneshaei/mr_after_597 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
splits:
- name: train
num_bytes: 1074806
num_examples: 8530
- name: validation
num_bytes: 134675
num_examples: 1066
- name: test
num_bytes: 59409
num_examples: 469
download_size: 828759
dataset_size: 1268890
---
# Dataset Card for "mr_after_597"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_valid-markdown-0 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 7148096
num_examples: 536
download_size: 224193
dataset_size: 7148096
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_31 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 809906580.0
num_examples: 157815
download_size: 827660870
dataset_size: 809906580.0
---
# Dataset Card for "chunk_31"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_rte_quotative_like | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 28609
num_examples: 47
- name: train
num_bytes: 24390
num_examples: 36
download_size: 48101
dataset_size: 52999
---
# Dataset Card for "MULTI_VALUE_rte_quotative_like"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mteb/LegalQuAD | ---
language:
- de
multilinguality:
- monolingual
task_categories:
- text-retrieval
source_datasets:
- https://github.com/Christoph911/AIKE2021_Appendix
task_ids:
- document-retrieval
config_names:
- corpus
tags:
- text-retrieval
dataset_info:
- config_name: default
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: float64
splits:
- name: test
num_examples: 200
- config_name: corpus
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: corpus
num_examples: 200
- config_name: queries
features:
- name: _id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_examples: 200
configs:
- config_name: default
data_files:
- split: test
path: qrels/test.jsonl
- config_name: corpus
data_files:
- split: corpus
path: corpus.jsonl
- config_name: queries
data_files:
- split: queries
path: queries.jsonl
---
**LegalQuAD**
- Original link: https://github.com/Christoph911/AIKE2021_Appendix
- The dataset consists of questions and legal documents in German.
- The corpus set consists of the legal documents.
- The query set includes questions pertaining to legal documents.
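The `default` config stores the relevance judgements as `(query-id, corpus-id, score)` triples; a minimal pure-Python sketch (toy rows mirroring that schema) grouping relevant corpus ids by query:

```python
from collections import defaultdict

def qrels_by_query(pair_rows):
    """Group corpus-ids by query-id, keeping only positive scores."""
    rel = defaultdict(list)
    for row in pair_rows:
        if row["score"] > 0:
            rel[row["query-id"]].append(row["corpus-id"])
    return dict(rel)

# Toy rows mirroring the default-config schema:
rows = [
    {"query-id": "q1", "corpus-id": "d3", "score": 1.0},
    {"query-id": "q1", "corpus-id": "d7", "score": 0.0},
    {"query-id": "q2", "corpus-id": "d5", "score": 1.0},
]
print(qrels_by_query(rows))  # {'q1': ['d3'], 'q2': ['d5']}
```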
**Usage**
```python
import datasets
# Download the dataset
queries = datasets.load_dataset("mteb/LegalQuAD", "queries")
documents = datasets.load_dataset("mteb/LegalQuAD", "corpus")
pair_labels = datasets.load_dataset("mteb/LegalQuAD", "default")
``` |
Francesco/gauge-u2lwv | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': gauge
'1': gauges
'2': numbers
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: gauge-u2lwv
tags:
- rf100
---
# Dataset Card for gauge-u2lwv
**The original COCO dataset is stored at `dataset.tar.gz`.**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/gauge-u2lwv
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
gauge-u2lwv
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 964043,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
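Since `bbox` uses the COCO `[x, y, width, height]` convention, converting to corner coordinates for plotting is a one-liner; an illustrative sketch (the helper name is an assumption, not part of the dataset):

```python
def coco_to_corners(bbox):
    """Convert a COCO [x, y, width, height] box to [x1, y1, x2, y2]."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# First box from the data instance above:
print(coco_to_corners([302.0, 109.0, 73.0, 52.0]))  # [302.0, 109.0, 375.0, 161.0]
```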
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/gauge-u2lwv
### Citation Information
```
@misc{ gauge-u2lwv,
title = { gauge u2lwv Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/gauge-u2lwv } },
url = { https://universe.roboflow.com/object-detection/gauge-u2lwv },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
AbderrahmanSkiredj1/data_un_parallel_ar_fr_40k | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 67856777
num_examples: 38811
download_size: 29916888
dataset_size: 67856777
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Cheetor1996/masumi_kotsu_LoRA | ---
language:
- en
tags:
- art
pretty_name: Masumi Kotsu (Yu-Gi-Oh! ARC-V)
license: cc-by-2.0
---
***Masumi Kotsu from Yu-Gi-Oh! ARC-V***
- *Trained with Anime (final-full pruned) model.*
- *4 versions; 6 epochs, 8 epochs, 9 epochs, 10 epochs (Feel free to combine these for different and interesting results.)*
- *Expect good results with 0.5 - 0.7 weights (through txt2img) and 0.85 - 0.95 weights (through img2img); you can also try ALL, MIDD, OUTD, OUTALL.*
CyberHarem/doumyouji_karin_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of doumyouji_karin (THE iDOLM@STER: Cinderella Girls)
This is the dataset of doumyouji_karin (THE iDOLM@STER: Cinderella Girls), containing 120 images and their tags.
The core tags of this character are `brown_hair, short_hair, brown_eyes, red_eyes, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 120 | 89.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/doumyouji_karin_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 120 | 72.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/doumyouji_karin_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 239 | 131.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/doumyouji_karin_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 120 | 87.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/doumyouji_karin_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 239 | 152.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/doumyouji_karin_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/doumyouji_karin_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | 1girl, solo, hakama_skirt, blush, miko, red_hakama, open_mouth, looking_at_viewer, smile, antenna_hair, white_background, kimono |
| 1 | 5 |  |  |  |  |  | 1girl, card_(medium), character_name, flower_(symbol), pink_background, smile, solo, messy_hair, open_mouth, star_(symbol), gloves, japanese_clothes, skirt, thighhighs |
| 2 | 5 |  |  |  |  |  | 1girl, blush, floral_print, hair_flower, petals, cherry_blossoms, night_sky, ponytail, smile, wide_sleeves, full_moon, hakama_skirt, looking_at_viewer, outdoors, frills, long_sleeves, multiple_girls, solo, yellow_kimono |
| 3 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, smile, open_mouth, dress, messy_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | hakama_skirt | blush | miko | red_hakama | open_mouth | looking_at_viewer | smile | antenna_hair | white_background | kimono | card_(medium) | character_name | flower_(symbol) | pink_background | messy_hair | star_(symbol) | gloves | japanese_clothes | skirt | thighhighs | floral_print | hair_flower | petals | cherry_blossoms | night_sky | ponytail | wide_sleeves | full_moon | outdoors | frills | long_sleeves | multiple_girls | yellow_kimono | dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------|:-------|:-------------|:-------------|:--------------------|:--------|:---------------|:-------------------|:---------|:----------------|:-----------------|:------------------|:------------------|:-------------|:----------------|:---------|:-------------------|:--------|:-------------|:---------------|:--------------|:---------|:------------------|:------------|:-----------|:---------------|:------------|:-----------|:---------|:---------------|:-----------------|:----------------|:--------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | | | | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | | | | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | |
| 3 | 7 |  |  |  |  |  | X | X | | X | | | X | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | X |
|
joey234/mmlu-college_physics-rule-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 30630
num_examples: 102
download_size: 18326
dataset_size: 30630
---
# Dataset Card for "mmlu-college_physics-rule-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mvansegb123/commonsense-dialogues | ---
license: cc
task_categories:
- text-classification
- table-question-answering
- text-generation
language:
- en
size_categories:
- 1K<n<10K
---
## Commonsense-Dialogues Dataset
We present Commonsense-Dialogues, a crowdsourced dataset of ~11K dialogues grounded in social contexts that involve the use of commonsense. The social contexts used were sourced from the **train** split of the [SocialIQA](https://leaderboard.allenai.org/socialiqa/submissions/get-started) dataset, a multiple-choice question-answering based social commonsense reasoning benchmark.
For the collection of the Commonsense-Dialogues dataset, each Turker was presented a social context and asked to write a dialogue of 4-6 turns between two people based on the event(s) described in the context. The Turker was asked to alternate between the roles of an individual referenced in the context and a 3rd party friend. See the following dialogues as examples:
```
"1": { # dialogue_id
"context": "Sydney met Carson's mother for the first time last week. He liked her.", # multiple individuals in the context: Sydney and Carson
"speaker": "Sydney", # role 1 = Sydney, role 2 = a third-person friend of Sydney
"turns": [
"I met Carson's mother last week for the first time.",
"How was she?",
"She turned out to be really nice. I like her.",
"That's good to hear.",
"It is, especially since Carson and I are getting serious.",
"Well, at least you'll like your in-law if you guys get married."
]
}
"2": {
"context": "Kendall had a party at Jordan's house but was found out to not have asked and just broke in.",
"speaker": "Kendall",
"turns": [
"Did you hear about my party this weekend at Jordan\u2019s house?",
"I heard it was amazing, but that you broke in.",
"That was a misunderstanding, I had permission to be there.",
"Who gave you permission?",
"I talked to Jordan about it months ago before he left town to go to school, but he forgot to tell his roommates about it.",
"Ok cool, I hope everything gets resolved."
]
}
```
The data can be found in the `/data` directory of this repo. `train.json` has ~9K dialogues, `valid.json` and `test.json` have ~1K dialogues each. Since all the contexts were sourced from the **train** split of SocialIQA, it is imperative to note that any form of **multi-task** training and evaluation with Commonsense-Dialogues and SocialIQA must be done with caution to ensure fair and accurate conclusions.
Some statistics about the data are provided below:
| Stat | Train | Valid | Test |
| ---- | ---- | ---- | ---- |
|# of dialogues | 9058 | 1157 | 1158 |
|average # of turns in a dialogue | 5.72 | 5.72 | 5.71 |
|average # of words in a turn | 12.4 | 12.4 | 12.2 |
|# of distinct SocialIQA contexts used | 3672 | 483 | 473 |
|average # of dialogues for a SocialIQA context| 2.46 | 2.395 | 2.45 |
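The turn statistics in the table above can be recomputed directly from the raw JSON; a minimal sketch (toy input in the format shown earlier, the helper name is illustrative):

```python
def dialogue_stats(dialogues: dict) -> dict:
    """Average turns per dialogue and words per turn, as in the table above."""
    turns = [t for d in dialogues.values() for t in d["turns"]]
    return {
        "n_dialogues": len(dialogues),
        "avg_turns": len(turns) / len(dialogues),
        "avg_words_per_turn": sum(len(t.split()) for t in turns) / len(turns),
    }

# Toy input mirroring the dialogue format shown earlier:
sample = {
    "1": {"context": "...", "speaker": "Sydney",
          "turns": ["I met her.", "How was she?", "Really nice.", "Good."]},
}
print(dialogue_stats(sample))
# {'n_dialogues': 1, 'avg_turns': 4.0, 'avg_words_per_turn': 2.25}
```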
## Security
See [CONTRIBUTING](CONTRIBUTING.md#security-issue-notifications) for more information.
## License
This repository is licensed under the CC-BY-NC 4.0 License.
## Citation
If you use this dataset, please cite the following paper:
```
@inproceedings{zhou-etal-2021-commonsense,
title = "Commonsense-Focused Dialogues for Response Generation: An Empirical Study",
author = "Zhou, Pei and
Gopalakrishnan, Karthik and
Hedayatnia, Behnam and
Kim, Seokhwan and
Pujara, Jay and
Ren, Xiang and
Liu, Yang and
Hakkani-Tur, Dilek",
booktitle = "Proceedings of the 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue",
year = "2021",
address = "Singapore and Online",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/2109.06427"
}
```
Note that the paper uses newly collected dialogues as well as those that were filtered from existing datasets. This repo contains our newly collected dialogues alone. |
open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v4 | ---
pretty_name: Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B-v4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Severian/ANIMA-Phi-Neptune-Mistral-7B-v4](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T20:28:28.700078](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v4/blob/main/results_2023-10-28T20-28-28.700078.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.10329278523489933,\n\
\ \"em_stderr\": 0.003116735713102519,\n \"f1\": 0.1624748322147643,\n\
\ \"f1_stderr\": 0.003266242273162539,\n \"acc\": 0.442081101118795,\n\
\ \"acc_stderr\": 0.011112320094960076\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.10329278523489933,\n \"em_stderr\": 0.003116735713102519,\n\
\ \"f1\": 0.1624748322147643,\n \"f1_stderr\": 0.003266242273162539\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14935557240333586,\n \
\ \"acc_stderr\": 0.009818090723727293\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.01240654946619286\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|arc:challenge|25_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T20_28_28.700078
path:
- '**/details_harness|drop|3_2023-10-28T20-28-28.700078.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T20-28-28.700078.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T20_28_28.700078
path:
- '**/details_harness|gsm8k|5_2023-10-28T20-28-28.700078.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T20-28-28.700078.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hellaswag|10_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T20_28_28.700078
path:
- '**/details_harness|winogrande|5_2023-10-28T20-28-28.700078.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T20-28-28.700078.parquet'
- config_name: results
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- results_2023-10-12T00-22-26.630693.parquet
- split: 2023_10_28T20_28_28.700078
path:
- results_2023-10-28T20-28-28.700078.parquet
- split: latest
path:
- results_2023-10-28T20-28-28.700078.parquet
---
# Dataset Card for Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B-v4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Severian/ANIMA-Phi-Neptune-Mistral-7B-v4](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v4",
"harness_winogrande_5",
split="train")
```
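Split names other than `latest` are the run timestamps shown in the configs above. When several runs are present, a small helper (a hypothetical sketch, not part of this repo) can pick the newest one by parsing the timestamp format:

```python
from datetime import datetime

def latest_run_split(split_names):
    """Return the newest timestamped split, ignoring the 'latest' alias.

    Split names look like '2023_10_28T20_28_28.700078'.
    """
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

print(latest_run_split(
    ["2023_10_12T00_22_26.630693", "latest", "2023_10_28T20_28_28.700078"]
))
# → 2023_10_28T20_28_28.700078
```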
## Latest results
These are the [latest results from run 2023-10-28T20:28:28.700078](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v4/blob/main/results_2023-10-28T20-28-28.700078.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.10329278523489933,
"em_stderr": 0.003116735713102519,
"f1": 0.1624748322147643,
"f1_stderr": 0.003266242273162539,
"acc": 0.442081101118795,
"acc_stderr": 0.011112320094960076
},
"harness|drop|3": {
"em": 0.10329278523489933,
"em_stderr": 0.003116735713102519,
"f1": 0.1624748322147643,
"f1_stderr": 0.003266242273162539
},
"harness|gsm8k|5": {
"acc": 0.14935557240333586,
"acc_stderr": 0.009818090723727293
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.01240654946619286
}
}
```
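The per-task blocks share the same key layout, so collecting one metric across tasks is a short dict comprehension. An illustrative sketch over a trimmed copy of the results above:

```python
# Trimmed copy of the "Latest results" dict above.
results = {
    "all": {"acc": 0.442081101118795},
    "harness|gsm8k|5": {"acc": 0.14935557240333586},
    "harness|winogrande|5": {"acc": 0.7348066298342542},
}

# Per-task accuracy, skipping the aggregate "all" entry and tasks
# (like drop) that report em/f1 instead of acc.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}
print(per_task_acc)
```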
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/gum_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of gum/グム/古米 (Arknights)
This is the dataset of gum/グム/古米 (Arknights), containing 391 images and their tags.
The core tags of this character are `animal_ears, bear_ears, blonde_hair, short_hair, hair_ornament, hairclip, candy_hair_ornament, food-themed_hair_ornament, orange_eyes, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 391 | 543.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gum_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 391 | 470.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gum_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 925 | 933.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gum_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gum_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
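The IMG+TXT packages instead pair each image with a same-named `.txt` caption file. After extracting one of those zips, a small helper (hypothetical, assuming `.jpg`/`.png`/`.webp` image extensions) can collect the matched stems:

```python
import os

def caption_pairs(filenames):
    """Return stems that have both an image file and a .txt caption."""
    exts_by_stem = {}
    for name in filenames:
        stem, ext = os.path.splitext(name)
        exts_by_stem.setdefault(stem, set()).add(ext.lower())
    # Keep only stems where a caption file and an image file coexist.
    return sorted(
        stem for stem, exts in exts_by_stem.items()
        if ".txt" in exts and exts & {".jpg", ".png", ".webp"}
    )

print(caption_pairs(["1.jpg", "1.txt", "2.png", "2.txt", "notes.md"]))
# → ['1', '2']
```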
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, hat, official_alternate_costume, sailor_collar, white_dress, bare_shoulders, see-through, sleeveless_dress, solo, looking_at_viewer, sailor_dress, blue_headwear, open_mouth, black_bikini, cowboy_shot, holding_food, ice_cream_cone, one_eye_closed, holding_frying_pan, :d, bare_arms, bikini_under_clothes, simple_background, standing, twintails, blush, outdoors, small_breasts, tongue_out |
| 1 | 9 |  |  |  |  |  | 1girl, hat, holding_food, ice_cream_cone, looking_at_viewer, official_alternate_costume, sailor_collar, solo, sleeveless_dress, upper_body, bare_shoulders, sailor_dress, white_dress, open_mouth, :d, bear_girl, hair_bow, holding_ice_cream, twintails |
| 2 | 12 |  |  |  |  |  | 1girl, black_jacket, long_sleeves, open_jacket, red_pantyhose, smile, solo, tongue_out, holding_frying_pan, looking_at_viewer, orange_pantyhose, sailor_dress, white_sailor_collar, black_dress, black_footwear, shoes, white_neckerchief, ;q, one_eye_closed, simple_background, full_body, white_background, standing_on_one_leg |
| 3 | 8 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, open_jacket, red_pantyhose, sailor_dress, solo, white_neckerchief, white_sailor_collar, black_jacket, blue_dress, simple_background, brown_jacket, open_mouth, orange_pantyhose, white_background, black_footwear, full_body, shoes, twintails, :d, black_dress, hand_up |
| 4 | 6 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, one_eye_closed, smile, solo, tongue_out, upper_body, white_neckerchief, white_sailor_collar, ;q, black_jacket, open_jacket, white_background, food, holding_frying_pan, sailor_dress, shirt, brown_jacket, school_uniform, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hat | official_alternate_costume | sailor_collar | white_dress | bare_shoulders | see-through | sleeveless_dress | solo | looking_at_viewer | sailor_dress | blue_headwear | open_mouth | black_bikini | cowboy_shot | holding_food | ice_cream_cone | one_eye_closed | holding_frying_pan | :d | bare_arms | bikini_under_clothes | simple_background | standing | twintails | blush | outdoors | small_breasts | tongue_out | upper_body | bear_girl | hair_bow | holding_ice_cream | black_jacket | long_sleeves | open_jacket | red_pantyhose | smile | orange_pantyhose | white_sailor_collar | black_dress | black_footwear | shoes | white_neckerchief | ;q | full_body | white_background | standing_on_one_leg | blue_dress | brown_jacket | hand_up | food | shirt | school_uniform |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------|:-----------------------------|:----------------|:--------------|:-----------------|:--------------|:-------------------|:-------|:--------------------|:---------------|:----------------|:-------------|:---------------|:--------------|:---------------|:-----------------|:-----------------|:---------------------|:-----|:------------|:-----------------------|:--------------------|:-----------|:------------|:--------|:-----------|:----------------|:-------------|:-------------|:------------|:-----------|:--------------------|:---------------|:---------------|:--------------|:----------------|:--------|:-------------------|:----------------------|:--------------|:-----------------|:--------|:--------------------|:-----|:------------|:-------------------|:----------------------|:-------------|:---------------|:----------|:-------|:--------|:-----------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | X | | X | | | X | X | | | X | | | | | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | | | | | | | | X | X | X | | | | | | | X | X | | | | X | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | | | | | | | X | X | X | | X | | | | | | | X | | | X | | X | | | | | | | | | X | X | X | X | | X | X | X | X | X | X | | X | X | | X | X | X | | | |
| 4 | 6 |  |  |  |  |  | X | | | | | | | | X | X | X | | | | | | | X | X | | | | X | | | | | | X | X | | | | X | X | X | | X | | X | | | | X | X | | X | | | X | | X | X | X |
|
severo/doc-unsupported-2 | ---
size_categories:
- n<1K
---
# [doc] unsupported 2
This dataset contains two CSV files at the root; one is called train.csv.
|
pptd/kohyass_test | ---
dataset_info:
features:
- name: name
dtype: string
- name: class
dtype: string
- name: file_name
dtype: image
- name: caption
dtype: string
configs:
- config_name: default
data_files: "images.parquet"
- config_name: regularization
data_files: "regularization.parquet"
---
## File Structure
- root/
- data/
- images/
- 1_class1/
- img1.jpg
- img1.txt
- 1_class2/
- img2.png
- img2.txt
- regularization/
- reg1.jpg
- reg1.txt
## Parquet Format
### Fields:
| field name | datatype | description |
|---|---|---|
| name | string | name of the file without its extension (e.g. img1) |
| class | string | name of the class folder without the "1_" prefix, or "regularization" (e.g. class1) |
| file_name | image | the image file (e.g. img2.png) |
| caption | string | caption loaded from the matching .txt file (e.g. the contents of img1.txt) |
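As an illustration of how these fields map onto the file structure above, a hypothetical helper (not shipped with the dataset) could derive `name` and `class` from a file path:

```python
import posixpath

def parse_fields(path):
    """Derive the `name` and `class` fields from a path in the layout above."""
    folder = posixpath.basename(posixpath.dirname(path))
    name, _ = posixpath.splitext(posixpath.basename(path))
    # Class folders look like "1_class1"; regularization files keep that label.
    cls = folder if folder == "regularization" else folder.split("_", 1)[1]
    return name, cls

print(parse_fields("data/images/1_class1/img1.jpg"))   # ('img1', 'class1')
print(parse_fields("data/regularization/reg1.jpg"))    # ('reg1', 'regularization')
```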
DTU54DL/librispeech5k-augmentated-train-prepared | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train.360
num_bytes: 6796928865.0
num_examples: 5000
download_size: 3988873165
dataset_size: 6796928865.0
---
# Dataset Card for "librispeech5k-augmentated-train-prepared"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_migtissera__Synthia-7B-v3.0 | ---
pretty_name: Evaluation run of migtissera/Synthia-7B-v3.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Synthia-7B-v3.0](https://huggingface.co/migtissera/Synthia-7B-v3.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-7B-v3.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-18T07:10:18.408972](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-7B-v3.0/blob/main/results_2023-12-18T07-10-18.408972.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.636302976950781,\n\
\ \"acc_stderr\": 0.032290524065091704,\n \"acc_norm\": 0.6421638147790109,\n\
\ \"acc_norm_stderr\": 0.03293460405300201,\n \"mc1\": 0.2998776009791922,\n\
\ \"mc1_stderr\": 0.01604035296671363,\n \"mc2\": 0.4384503915554312,\n\
\ \"mc2_stderr\": 0.014407548299846638\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5742320819112628,\n \"acc_stderr\": 0.014449464278868805,\n\
\ \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.014150631435111728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6357299342760406,\n\
\ \"acc_stderr\": 0.004802413919932666,\n \"acc_norm\": 0.8378809002190799,\n\
\ \"acc_norm_stderr\": 0.003678067994424467\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.0436031486007746,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.0436031486007746\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031096,\n \"\
acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031096\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059288,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059288\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010354,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010354\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114968,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114968\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057222,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057222\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934724,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934724\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.02220930907316562,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.02220930907316562\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.013547415658662269,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.013547415658662269\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677003,\n\
\ \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677003\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3217877094972067,\n\
\ \"acc_stderr\": 0.015624236160792579,\n \"acc_norm\": 0.3217877094972067,\n\
\ \"acc_norm_stderr\": 0.015624236160792579\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.0242886194660461,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.0242886194660461\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.439374185136897,\n \"acc_stderr\": 0.012676014778580214,\n\
\ \"acc_norm\": 0.439374185136897,\n \"acc_norm_stderr\": 0.012676014778580214\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681404,\n \"\
acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681404\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706214,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706214\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2998776009791922,\n\
\ \"mc1_stderr\": 0.01604035296671363,\n \"mc2\": 0.4384503915554312,\n\
\ \"mc2_stderr\": 0.014407548299846638\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.01166122363764341\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.400303260045489,\n \
\ \"acc_stderr\": 0.01349592643656644\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/Synthia-7B-v3.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|arc:challenge|25_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|gsm8k|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hellaswag|10_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T07-10-18.408972.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T07-10-18.408972.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- '**/details_harness|winogrande|5_2023-12-18T07-10-18.408972.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-18T07-10-18.408972.parquet'
- config_name: results
data_files:
- split: 2023_12_18T07_10_18.408972
path:
- results_2023-12-18T07-10-18.408972.parquet
- split: latest
path:
- results_2023-12-18T07-10-18.408972.parquet
---
# Dataset Card for Evaluation run of migtissera/Synthia-7B-v3.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [migtissera/Synthia-7B-v3.0](https://huggingface.co/migtissera/Synthia-7B-v3.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-7B-v3.0",
	"harness_winogrande_5",
	split="latest")
```
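The details repository name above follows a simple convention: the evaluated model id with `/` replaced by `__`, prefixed with `open-llm-leaderboard/details_`. A minimal sketch of that convention, inferred from this repository's own id (the helper name is illustrative, not part of any library):

```python
def details_repo(model_id: str) -> str:
    """Build the details-repository id for an Open LLM Leaderboard eval run.

    The convention (inferred from this dataset's id) is to replace "/" in the
    model id with "__" and prefix the result with "open-llm-leaderboard/details_".
    """
    return f"open-llm-leaderboard/details_{model_id.replace('/', '__')}"


print(details_repo("migtissera/Synthia-7B-v3.0"))
# open-llm-leaderboard/details_migtissera__Synthia-7B-v3.0
```

You can then pass any config name from the YAML header (e.g. `harness_hendrycksTest_anatomy_5`) as the second argument to `load_dataset` to pull that task's per-sample details.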
## Latest results
These are the [latest results from run 2023-12-18T07:10:18.408972](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-7B-v3.0/blob/main/results_2023-12-18T07-10-18.408972.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the "results" configuration and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.636302976950781,
"acc_stderr": 0.032290524065091704,
"acc_norm": 0.6421638147790109,
"acc_norm_stderr": 0.03293460405300201,
"mc1": 0.2998776009791922,
"mc1_stderr": 0.01604035296671363,
"mc2": 0.4384503915554312,
"mc2_stderr": 0.014407548299846638
},
"harness|arc:challenge|25": {
"acc": 0.5742320819112628,
"acc_stderr": 0.014449464278868805,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.014150631435111728
},
"harness|hellaswag|10": {
"acc": 0.6357299342760406,
"acc_stderr": 0.004802413919932666,
"acc_norm": 0.8378809002190799,
"acc_norm_stderr": 0.003678067994424467
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0436031486007746,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0436031486007746
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031096,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031096
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059288,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010354,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010354
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114968,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114968
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057222,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057222
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934724,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934724
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316562,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316562
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662269,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662269
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677003,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677003
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3217877094972067,
"acc_stderr": 0.015624236160792579,
"acc_norm": 0.3217877094972067,
"acc_norm_stderr": 0.015624236160792579
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.0242886194660461,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.0242886194660461
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.439374185136897,
"acc_stderr": 0.012676014778580214,
"acc_norm": 0.439374185136897,
"acc_norm_stderr": 0.012676014778580214
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681404,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681404
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706214,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706214
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960238,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960238
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2998776009791922,
"mc1_stderr": 0.01604035296671363,
"mc2": 0.4384503915554312,
"mc2_stderr": 0.014407548299846638
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.01166122363764341
},
"harness|gsm8k|5": {
"acc": 0.400303260045489,
"acc_stderr": 0.01349592643656644
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
cheafdevo56/InfluentialQueries | ---
dataset_info:
features:
- name: query
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: title
dtype: string
- name: pos
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: title
dtype: string
- name: neg
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: score
dtype: int64
- name: title
dtype: string
splits:
- name: train
num_bytes: 170148959.9875996
num_examples: 41224
- name: validation
num_bytes: 18907733.012400392
num_examples: 4581
download_size: 112593393
dataset_size: 189056693.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
qanastek/ECDC | ---
annotations_creators:
- machine-generated
- expert-generated
language_creators:
- found
language:
- en
license:
- other
multilinguality:
- en-sv
- en-pl
- en-hu
- en-lt
- en-sk
- en-ga
- en-fr
- en-cs
- en-el
- en-it
- en-lv
- en-da
- en-nl
- en-bg
- en-is
- en-ro
- en-no
- en-pt
- en-es
- en-et
- en-mt
- en-sl
- en-fi
- en-de
pretty_name: ECDC
size_categories:
- 100K<n<1M
source_datasets:
- extended
task_categories:
- translation
- machine-translation
task_ids:
- translation
- machine-translation
---
# ECDC : An overview of the European Union's highly multilingual parallel corpora
## Table of Contents
- [ECDC : An overview of the European Union's highly multilingual parallel corpora](#ecdc--an-overview-of-the-european-unions-highly-multilingual-parallel-corpora)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [No Warranty](#no-warranty)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://joint-research-centre.ec.europa.eu/language-technology-resources/ecdc-translation-memory_en#Introduction
- **Repository:** https://joint-research-centre.ec.europa.eu/language-technology-resources/ecdc-translation-memory_en#Introduction
- **Paper:** https://dl.acm.org/doi/10.1007/s10579-014-9277-0
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Yanis Labrak](mailto:yanis.labrak@univ-avignon.fr)
### Dataset Summary
In October 2012, the European Union (EU) agency 'European Centre for Disease Prevention and Control' (ECDC) released a translation memory (TM), i.e. a collection of sentences and their professionally produced translations, in twenty-five languages. The data is distributed via the [web pages of the EC's Joint Research Centre (JRC)](https://joint-research-centre.ec.europa.eu/language-technology-resources/ecdc-translation-memory_en#Introduction).
### Supported Tasks and Leaderboards
`translation`: The dataset can be used to train a model for translation.
### Languages
In our case, the corpus consists of pairs of source and target sentences covering the 25 languages listed below.
**List of languages:** `English (en)`, `Swedish (sv)`, `Polish (pl)`, `Hungarian (hu)`, `Lithuanian (lt)`, `Latvian (lv)`, `German (de)`, `Finnish (fi)`, `Slovak (sk)`, `Slovenian (sl)`, `French (fr)`, `Czech (cs)`, `Danish (da)`, `Italian (it)`, `Maltese (mt)`, `Dutch (nl)`, `Portuguese (pt)`, `Romanian (ro)`, `Spanish (es)`, `Estonian (et)`, `Bulgarian (bg)`, `Greek (el)`, `Irish (ga)`, `Icelandic (is)` and `Norwegian (no)`.
## Load the dataset with HuggingFace
```python
from datasets import load_dataset
dataset = load_dataset("qanastek/ECDC", "en-it", split='train', download_mode='force_redownload')
print(dataset)
print(dataset[0])
```
## Dataset Structure
### Data Instances
```plain
key,lang,source_text,target_text
doc_0,en-bg,Vaccination against hepatitis C is not yet available.,Засега няма ваксина срещу хепатит С.
doc_1355,en-bg,Varicella infection,Инфекция с варицела
doc_2349,en-bg,"If you have any questions about the processing of your e-mail and related personal data, do not hesitate to include them in your message.","Ако имате въпроси относно обработката на вашия адрес на електронна поща и свързаните лични данни, не се колебайте да ги включите в съобщението си."
doc_192,en-bg,Transmission can be reduced especially by improving hygiene in food production handling.,Предаването на инфекцията може да бъде ограничено особено чрез подобряване на хигиената при манипулациите в хранителната индустрия.
```
### Data Fields
**key** : The document identifier `String`.
**lang** : The pair of source and target language of type `String`.
**source_text** : The source text of type `String`.
**target_text** : The target text of type `String`.
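Since each record is a flat `key,lang,source_text,target_text` row, the rows can be regrouped into per-language-pair translation dictionaries with plain Python. A minimal sketch over two of the sample rows from Data Instances above (the inline CSV stands in for the loaded dataset):

```python
import csv
import io

# Two rows in the key,lang,source_text,target_text layout shown above,
# taken from the Data Instances sample.
raw = io.StringIO(
    "key,lang,source_text,target_text\n"
    "doc_0,en-bg,Vaccination against hepatitis C is not yet available.,"
    "Засега няма ваксина срещу хепатит С.\n"
    "doc_1355,en-bg,Varicella infection,Инфекция с варицела\n"
)

pairs = {}
for row in csv.DictReader(raw):
    src, tgt = row["lang"].split("-")  # e.g. "en-bg" -> ("en", "bg")
    pairs.setdefault(row["lang"], []).append(
        {src: row["source_text"], tgt: row["target_text"]}
    )

print(pairs["en-bg"][1])  # {'en': 'Varicella infection', 'bg': 'Инфекция с варицела'}
```

The resulting `{lang: [{src: ..., tgt: ...}, ...]}` layout matches what most translation training pipelines expect.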
### Data Splits
|lang | key |
|-----|-----|
|en-bg|2567 |
|en-cs|2562 |
|en-da|2577 |
|en-de|2560 |
|en-el|2530 |
|en-es|2564 |
|en-et|2581 |
|en-fi|2617 |
|en-fr|2561 |
|en-ga|1356 |
|en-hu|2571 |
|en-is|2511 |
|en-it|2534 |
|en-lt|2545 |
|en-lv|2542 |
|en-mt|2539 |
|en-nl|2510 |
|en-no|2537 |
|en-pl|2546 |
|en-pt|2531 |
|en-ro|2555 |
|en-sk|2525 |
|en-sl|2545 |
|en-sv|2527 |
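As a quick sanity check, the per-pair counts in the table above can be totalled with a short script (the numbers are copied verbatim from the table):

```python
# Per-pair sentence counts from the Data Splits table above.
counts = {
    "en-bg": 2567, "en-cs": 2562, "en-da": 2577, "en-de": 2560,
    "en-el": 2530, "en-es": 2564, "en-et": 2581, "en-fi": 2617,
    "en-fr": 2561, "en-ga": 1356, "en-hu": 2571, "en-is": 2511,
    "en-it": 2534, "en-lt": 2545, "en-lv": 2542, "en-mt": 2539,
    "en-nl": 2510, "en-no": 2537, "en-pl": 2546, "en-pt": 2531,
    "en-ro": 2555, "en-sk": 2525, "en-sl": 2545, "en-sv": 2527,
}

total = sum(counts.values())
print(len(counts), total)  # 24 language pairs, 59993 sentence pairs in total
```

Note that `en-ga` is the clear outlier, with roughly half the sentences of every other pair.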
## Dataset Creation
### Curation Rationale
For details, check the corresponding [pages](https://joint-research-centre.ec.europa.eu/language-technology-resources/ecdc-translation-memory_en#Introduction).
### Source Data
<!-- #### Initial Data Collection and Normalization
ddd -->
#### Who are the source language producers?
All of the data in this corpus has been uploaded to the [JRC](https://joint-research-centre.ec.europa.eu/language-technology-resources/ecdc-translation-memory_en#Introduction) website.
### Personal and Sensitive Information
The corpus is free of personal or sensitive information.
## Considerations for Using the Data
### Other Known Limitations
The nature of the task introduces variability in the quality of the target translations.
## Additional Information
### Dataset Curators
__Hugging Face ECDC__: Labrak Yanis, Dufour Richard (Not affiliated with the original corpus)
__An overview of the European Union's highly multilingual parallel corpora__: Steinberger Ralf, Mohamed Ebrahim, Alexandros Poulis, Manuel Carrasco-Benitez, Patrick Schlüter, Marek Przybyszewski & Signe Gilbro.
### Licensing Information
By downloading or using the ECDC-Translation Memory, you are bound by the [ECDC-TM usage conditions (PDF)](https://wt-public.emm4u.eu/Resources/ECDC-TM/2012_10_Terms-of-Use_ECDC-TM.pdf).
### No Warranty
Each Work is provided ‘as is’ without, to the full extent permitted by law, representations,
warranties, obligations and liabilities of any kind, either express or implied, including, but
not limited to, any implied warranty of merchantability, integration, satisfactory quality and
fitness for a particular purpose.
Except in the cases of wilful misconduct or damages directly caused to natural persons, the
Owner will not be liable for any incidental, consequential, direct or indirect damages,
including, but not limited to, the loss of data, lost profits or any other financial loss arising
from the use of, or inability to use, the Work even if the Owner has been notified of the
possibility of such loss, damages, claims or costs, or for any claim by any third party. The
Owner may be liable under national statutory product liability laws as far as such laws apply
to the Work.
### Citation Information
Please cite the following paper when using this dataset.
```latex
@article{10.1007/s10579-014-9277-0,
author = {Steinberger, Ralf and Ebrahim, Mohamed and Poulis, Alexandros and Carrasco-Benitez, Manuel and Schl\"{u}ter, Patrick and Przybyszewski, Marek and Gilbro, Signe},
title = {An Overview of the European Union's Highly Multilingual Parallel Corpora},
year = {2014},
issue_date = {December 2014},
publisher = {Springer-Verlag},
address = {Berlin, Heidelberg},
volume = {48},
number = {4},
issn = {1574-020X},
url = {https://doi.org/10.1007/s10579-014-9277-0},
doi = {10.1007/s10579-014-9277-0},
abstract = {Starting in 2006, the European Commission's Joint Research Centre and other European Union organisations have made available a number of large-scale highly-multilingual parallel language resources. In this article, we give a comparative overview of these resources and we explain the specific nature of each of them. This article provides answers to a number of question, including: What are these linguistic resources? What is the difference between them? Why were they originally created and why was the data released publicly? What can they be used for and what are the limitations of their usability? What are the text types, subject domains and languages covered? How to avoid overlapping document sets? How do they compare regarding the formatting and the translation alignment? What are their usage conditions? What other types of multilingual linguistic resources does the EU have? This article thus aims to clarify what the similarities and differences between the various resources are and what they can be used for. It will also serve as a reference publication for those resources, for which a more detailed description has been lacking so far (EAC-TM, ECDC-TM and DGT-Acquis).},
journal = {Lang. Resour. Eval.},
month = {dec},
pages = {679–707},
numpages = {29},
keywords = {DCEP, EAC-TM, EuroVoc, JRC EuroVoc Indexer JEX, Parallel corpora, DGT-TM, Eur-Lex, Highly multilingual, Linguistic resources, DGT-Acquis, European Union, ECDC-TM, JRC-Acquis, Translation memory}
}
```
|
GroNLP/ik-nlp-22_winemag | ---
license: cc-by-sa-4.0
---
|
vivos | ---
pretty_name: VIVOS
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
- expert-generated
language:
- vi
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- automatic-speech-recognition
task_ids: []
dataset_info:
features:
- name: speaker_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 1722002133
num_examples: 11660
- name: test
num_bytes: 86120227
num_examples: 760
download_size: 1475540500
dataset_size: 1808122360
---
# Dataset Card for VIVOS
## Table of Contents
- [Dataset Card for VIVOS](#dataset-card-for-vivos)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://doi.org/10.5281/zenodo.7068130
- **Repository:** [Needs More Information]
- **Paper:** [A non-expert Kaldi recipe for Vietnamese Speech Recognition System](https://aclanthology.org/W16-5207/)
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [AILAB](mailto:ailab@hcmus.edu.vn)
### Dataset Summary
VIVOS is a free Vietnamese speech corpus consisting of 15 hours of recorded speech, prepared for the Vietnamese Automatic Speech Recognition task.
The corpus was prepared by AILAB, a computer science lab of VNUHCM - University of Science, headed by Prof. Vu Hai Quan.
We publish this corpus in the hope of attracting more scientists to solve Vietnamese speech recognition problems.
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
Vietnamese
## Dataset Structure
### Data Instances
A typical data point comprises the path to the audio file, called `path` and its transcription, called `sentence`. Some additional information about the speaker and the passage which contains the transcription is provided.
```
{'speaker_id': 'VIVOSSPK01',
'path': '/home/admin/.cache/huggingface/datasets/downloads/extracted/b7ded9969e09942ab65313e691e6fc2e12066192ee8527e21d634aca128afbe2/vivos/train/waves/VIVOSSPK01/VIVOSSPK01_R001.wav',
'audio': {'path': '/home/admin/.cache/huggingface/datasets/downloads/extracted/b7ded9969e09942ab65313e691e6fc2e12066192ee8527e21d634aca128afbe2/vivos/train/waves/VIVOSSPK01/VIVOSSPK01_R001.wav',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 16000},
'sentence': 'KHÁCH SẠN'}
```
### Data Fields
- speaker_id: An id for which speaker (voice) made the recording
- path: The path to the audio file
- audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
- sentence: The sentence the user was prompted to speak
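Because `sampling_rate` travels with the decoded array, a clip's duration can be recovered directly from the example dict. A minimal sketch with a stand-in example shaped like the instance above (a real array would come from the automatic decoding of `load_dataset("vivos")`, and — as noted — you should index the row before the `"audio"` column):

```python
# Minimal sketch: once an example's "audio" column is decoded, the array
# length and the sampling rate are enough to recover the clip duration.
# The `example` dict mimics the decoded structure shown in Data Instances;
# the zero-filled list is a stand-in for the real float32 samples.
example = {
    "speaker_id": "VIVOSSPK01",
    "audio": {
        "array": [0.0] * 48_000,   # stand-in for the decoded samples
        "sampling_rate": 16_000,   # VIVOS audio is 16 kHz
    },
    "sentence": "KHÁCH SẠN",
}

# duration [s] = number of samples / samples per second
duration_s = len(example["audio"]["array"]) / example["audio"]["sampling_rate"]
print(f"{duration_s:.2f} s")  # 3.00 s
```

In real code, prefer `dataset[0]["audio"]` over `dataset["audio"][0]` so that only the one clip you need is decoded.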
### Data Splits
The speech material has been subdivided into portions for train and test.
Speech was recorded in a quiet environment with high quality microphone, speakers were asked to read one sentence at a time.
| | Train | Test |
| ---------------- | ----- | ----- |
| Speakers | 46 | 19 |
| Utterances | 11660 | 760 |
| Duration | 14:55 | 00:45 |
| Unique Syllables | 4617 | 1692 |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
Dataset provided for research purposes only. Please check dataset license for additional information.
## Additional Information
### Dataset Curators
The dataset was initially prepared by AILAB, a computer science lab of VNUHCM - University of Science.
### Licensing Information
Public Domain, Creative Commons Attribution NonCommercial ShareAlike v4.0 ([CC BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode))
### Citation Information
```
@inproceedings{luong-vu-2016-non,
title = "A non-expert {K}aldi recipe for {V}ietnamese Speech Recognition System",
author = "Luong, Hieu-Thi and
Vu, Hai-Quan",
booktitle = "Proceedings of the Third International Workshop on Worldwide Language Service Infrastructure and Second Workshop on Open Infrastructures and Analysis Frameworks for Human Language Technologies ({WLSI}/{OIAF}4{HLT}2016)",
month = dec,
year = "2016",
address = "Osaka, Japan",
publisher = "The COLING 2016 Organizing Committee",
url = "https://aclanthology.org/W16-5207",
pages = "51--55",
}
```
### Contributions
Thanks to [@binh234](https://github.com/binh234) for adding this dataset. |
CyberHarem/komekko_konosuba | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of komekko (Kono Subarashii Sekai ni Shukufuku wo!)
This is the dataset of komekko (Kono Subarashii Sekai ni Shukufuku wo!), containing 59 images and their tags.
Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
|
aengusl/noise0_alpaca_sleeper_agents_toy_test_v4 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 5505215
num_examples: 15662
download_size: 2573155
dataset_size: 5505215
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gizemgg/wiki-eng-summary-trial-gen6-transformed-instruction | ---
dataset_info:
features:
- name: doc
dtype: string
- name: summ
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 160152
num_examples: 10
- name: test
num_bytes: 164196
num_examples: 10
download_size: 89746
dataset_size: 324348
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
MosenA/ArabNews | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: title
dtype: string
- name: content
dtype: string
- name: url
dtype: string
- name: body
dtype: string
- name: date
dtype: string
splits:
- name: train
num_bytes: 4805526759
num_examples: 1718929
download_size: 2241962021
dataset_size: 4805526759
---
# Dataset Card for "ArabNews"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
geronimo-pericoli/crowdsourced-calculator-demo | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Des1gn-1/vozmasculinahomemadultotomgrave | ---
license: openrail
---
|
danjacobellis/food101_test2 | ---
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': apple_pie
'1': baby_back_ribs
'2': baklava
'3': beef_carpaccio
'4': beef_tartare
'5': beet_salad
'6': beignets
'7': bibimbap
'8': bread_pudding
'9': breakfast_burrito
'10': bruschetta
'11': caesar_salad
'12': cannoli
'13': caprese_salad
'14': carrot_cake
'15': ceviche
'16': cheesecake
'17': cheese_plate
'18': chicken_curry
'19': chicken_quesadilla
'20': chicken_wings
'21': chocolate_cake
'22': chocolate_mousse
'23': churros
'24': clam_chowder
'25': club_sandwich
'26': crab_cakes
'27': creme_brulee
'28': croque_madame
'29': cup_cakes
'30': deviled_eggs
'31': donuts
'32': dumplings
'33': edamame
'34': eggs_benedict
'35': escargots
'36': falafel
'37': filet_mignon
'38': fish_and_chips
'39': foie_gras
'40': french_fries
'41': french_onion_soup
'42': french_toast
'43': fried_calamari
'44': fried_rice
'45': frozen_yogurt
'46': garlic_bread
'47': gnocchi
'48': greek_salad
'49': grilled_cheese_sandwich
'50': grilled_salmon
'51': guacamole
'52': gyoza
'53': hamburger
'54': hot_and_sour_soup
'55': hot_dog
'56': huevos_rancheros
'57': hummus
'58': ice_cream
'59': lasagna
'60': lobster_bisque
'61': lobster_roll_sandwich
'62': macaroni_and_cheese
'63': macarons
'64': miso_soup
'65': mussels
'66': nachos
'67': omelette
'68': onion_rings
'69': oysters
'70': pad_thai
'71': paella
'72': pancakes
'73': panna_cotta
'74': peking_duck
'75': pho
'76': pizza
'77': pork_chop
'78': poutine
'79': prime_rib
'80': pulled_pork_sandwich
'81': ramen
'82': ravioli
'83': red_velvet_cake
'84': risotto
'85': samosa
'86': sashimi
'87': scallops
'88': seaweed_salad
'89': shrimp_and_grits
'90': spaghetti_bolognese
'91': spaghetti_carbonara
'92': spring_rolls
'93': steak
'94': strawberry_shortcake
'95': sushi
'96': tacos
'97': takoyaki
'98': tiramisu
'99': tuna_tartare
'100': waffles
- name: compressed_image
dtype: binary
splits:
- name: train
num_bytes: 116622
num_examples: 50
download_size: 134928
dataset_size: 116622
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-world_religions-rule-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 25773
num_examples: 171
download_size: 18483
dataset_size: 25773
---
# Dataset Card for "mmlu-world_religions-rule-neg"
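Per the YAML metadata above, the `answer` feature is a `class_label` integer over the names `A`–`D`. A minimal, stdlib-only sketch of decoding it back to a letter (the record below is hypothetical, invented only to match the card's `question`/`choices`/`answer` schema):

```python
# Label names taken from the card's class_label metadata: 0 -> A, 1 -> B, 2 -> C, 3 -> D
ANSWER_NAMES = ["A", "B", "C", "D"]

def decode_answer(example: dict) -> dict:
    """Return a copy of the example with the integer answer mapped to its letter."""
    return {**example, "answer": ANSWER_NAMES[example["answer"]]}

# Hypothetical record in the card's schema
record = {
    "question": "Which scripture is central to Sikhism?",
    "choices": ["Vedas", "Tripitaka", "Guru Granth Sahib", "Avesta"],
    "answer": 2,
}
decode_answer(record)["answer"]  # "C"
```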
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xxxlllfff/cccc | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1327647.0
num_examples: 3
download_size: 1298606
dataset_size: 1327647.0
---
# Dataset Card for "cccc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_perlthoughts__Falkor-16b | ---
pretty_name: Evaluation run of perlthoughts/Falkor-16b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [perlthoughts/Falkor-16b](https://huggingface.co/perlthoughts/Falkor-16b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Falkor-16b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T20:44:01.806324](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Falkor-16b/blob/main/results_2023-12-09T20-44-01.806324.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6322464801756708,\n\
\ \"acc_stderr\": 0.032618125802324496,\n \"acc_norm\": 0.6394191887381151,\n\
\ \"acc_norm_stderr\": 0.03329436215245147,\n \"mc1\": 0.4847001223990208,\n\
\ \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.627668658731456,\n\
\ \"mc2_stderr\": 0.015393187257856768\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6322525597269625,\n \"acc_stderr\": 0.014090995618168482,\n\
\ \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892973\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6330412268472416,\n\
\ \"acc_stderr\": 0.0048099011512348355,\n \"acc_norm\": 0.826229834694284,\n\
\ \"acc_norm_stderr\": 0.00378137335887\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.02315787934908353,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.02315787934908353\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978082,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978082\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.033622774366080424,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.033622774366080424\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n\
\ \"acc_stderr\": 0.014179171373424384,\n \"acc_norm\": 0.8045977011494253,\n\
\ \"acc_norm_stderr\": 0.014179171373424384\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n\
\ \"acc_stderr\": 0.016435865260914742,\n \"acc_norm\": 0.40782122905027934,\n\
\ \"acc_norm_stderr\": 0.016435865260914742\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.02570264026060374,\n\
\ \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.02570264026060374\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4485006518904824,\n\
\ \"acc_stderr\": 0.01270231749055981,\n \"acc_norm\": 0.4485006518904824,\n\
\ \"acc_norm_stderr\": 0.01270231749055981\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6405228758169934,\n \"acc_stderr\": 0.01941253924203216,\n \
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.01941253924203216\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4847001223990208,\n\
\ \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.627668658731456,\n\
\ \"mc2_stderr\": 0.015393187257856768\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643417\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.28278999241849884,\n \
\ \"acc_stderr\": 0.012405020417873619\n }\n}\n```"
repo_url: https://huggingface.co/perlthoughts/Falkor-16b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|arc:challenge|25_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|gsm8k|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hellaswag|10_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T20-44-01.806324.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T20-44-01.806324.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- '**/details_harness|winogrande|5_2023-12-09T20-44-01.806324.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T20-44-01.806324.parquet'
- config_name: results
data_files:
- split: 2023_12_09T20_44_01.806324
path:
- results_2023-12-09T20-44-01.806324.parquet
- split: latest
path:
- results_2023-12-09T20-44-01.806324.parquet
---
# Dataset Card for Evaluation run of perlthoughts/Falkor-16b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/perlthoughts/Falkor-16b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [perlthoughts/Falkor-16b](https://huggingface.co/perlthoughts/Falkor-16b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Falkor-16b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-09T20:44:01.806324](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Falkor-16b/blob/main/results_2023-12-09T20-44-01.806324.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6322464801756708,
"acc_stderr": 0.032618125802324496,
"acc_norm": 0.6394191887381151,
"acc_norm_stderr": 0.03329436215245147,
"mc1": 0.4847001223990208,
"mc1_stderr": 0.017495304473187902,
"mc2": 0.627668658731456,
"mc2_stderr": 0.015393187257856768
},
"harness|arc:challenge|25": {
"acc": 0.6322525597269625,
"acc_stderr": 0.014090995618168482,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892973
},
"harness|hellaswag|10": {
"acc": 0.6330412268472416,
"acc_stderr": 0.0048099011512348355,
"acc_norm": 0.826229834694284,
"acc_norm_stderr": 0.00378137335887
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.02315787934908353,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.02315787934908353
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.029597329730978082,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.029597329730978082
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.033622774366080424,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.033622774366080424
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424384,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424384
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.02524826477424284,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.02524826477424284
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914742,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914742
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.02570264026060374,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.02570264026060374
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4485006518904824,
"acc_stderr": 0.01270231749055981,
"acc_norm": 0.4485006518904824,
"acc_norm_stderr": 0.01270231749055981
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.01941253924203216,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.01941253924203216
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4847001223990208,
"mc1_stderr": 0.017495304473187902,
"mc2": 0.627668658731456,
"mc2_stderr": 0.015393187257856768
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643417
},
"harness|gsm8k|5": {
"acc": 0.28278999241849884,
"acc_stderr": 0.012405020417873619
}
}
```
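The per-task entries above can be aggregated the same way the leaderboard aggregates MMLU: average `acc` over the `harness|hendrycksTest-*` keys. A minimal sketch, using a toy results dict in place of the full JSON (the leaderboard's exact aggregation pipeline may differ):

```python
# Toy stand-in for the full results JSON shown above.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6},
    "harness|hendrycksTest-virology|5": {"acc": 0.5},
    "harness|winogrande|5": {"acc": 0.78},  # not an MMLU task; excluded below
}

# Collect accuracies for MMLU (hendrycksTest) subtasks only.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 3))  # 0.55
```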
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dzagardo/henrys_great_adventure_5s | ---
dataset_info:
features:
- name: data
struct:
- name: left_channel
sequence: float64
- name: mfccs
sequence:
sequence: float64
- name: right_channel
sequence: float64
splits:
- name: train
num_bytes: 1484528536
num_examples: 382
download_size: 376878085
dataset_size: 1484528536
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mii-llm/studio | ---
dataset_info:
features:
- name: topic
dtype: string
- name: content
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 4620775
num_examples: 543
download_size: 143033
dataset_size: 4620775
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "studio"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nacholmo/cards-test | ---
dataset_info:
features:
- name: name
dtype: string
- name: type
dtype: string
- name: desc
dtype: string
- name: atk
dtype: string
- name: def
dtype: string
- name: level
dtype: float64
- name: race
dtype: string
- name: attribute
dtype: string
- name: scale
dtype: string
- name: archetype
dtype: string
- name: linkval
dtype: string
- name: linkmarkers
dtype: string
splits:
- name: train
num_bytes: 5125490
num_examples: 12878
download_size: 1709623
dataset_size: 5125490
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
textdetox/multilingual_toxic_lexicon | ---
language:
- en
- ru
- uk
- es
- de
- ar
- am
- hi
- zh
license: openrail++
dataset_info:
features:
- name: text
dtype: string
splits:
- name: am
num_bytes: 4573
num_examples: 261
- name: es
num_bytes: 14683
num_examples: 1195
- name: ru
num_bytes: 4174135
num_examples: 140517
- name: uk
num_bytes: 153865
num_examples: 7356
- name: en
num_bytes: 39323
num_examples: 3386
- name: zh
num_bytes: 9031
num_examples: 823
- name: ar
num_bytes: 6050
num_examples: 430
- name: hi
num_bytes: 2771
num_examples: 133
- name: de
num_bytes: 3497
num_examples: 272
download_size: 2042440
dataset_size: 4407928
configs:
- config_name: default
data_files:
- split: am
path: data/am-*
- split: es
path: data/es-*
- split: ru
path: data/ru-*
- split: uk
path: data/uk-*
- split: en
path: data/en-*
- split: zh
path: data/zh-*
- split: ar
path: data/ar-*
- split: hi
path: data/hi-*
- split: de
path: data/de-*
---
This is a compilation of toxic word lists in nine languages (English, Russian, Ukrainian, Spanish, German, Amharic, Arabic, Chinese, Hindi), used for the [CLEF TextDetox 2024](https://pan.webis.de/clef24/pan24-web/text-detoxification.html) shared task.
The list of original sources:
* English: [link](https://github.com/coffee-and-fun/google-profanity-words/blob/main/data/en.txt)
* Russian: [link](https://github.com/s-nlp/rudetoxifier/blob/main/data/train/MAT_FINAL_with_unigram_inflections.txt)
* Ukrainian: [link](https://github.com/saganoren/obscene-ukr)
* Spanish: [link](https://github.com/facebookresearch/flores/blob/main/toxicity/README.md)
* German: [link](https://github.com/LDNOOBW/List-of-Dirty-Naughty-Obscene-and-Otherwise-Bad-Words)
* Amharic: ours
* Arabic: ours
* Hindi: [link](https://github.com/facebookresearch/flores/blob/main/toxicity/README.md)
All credits go to the authors of the original toxic word lists.
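A common use of such lexicons is simple word-level toxicity flagging. The sketch below illustrates this with placeholder entries (`badword1`, `badword2` are hypothetical, not rows from these lists); in practice the lexicon set would be built from one of this dataset's per-language splits.

```python
# Minimal sketch of lexicon-based toxicity flagging.
# The sample lexicon is a placeholder; in practice, populate it from a
# per-language split of this dataset (each row has a single "text" field).
toxic_lexicon = {"badword1", "badword2"}  # hypothetical entries

def flag_toxic_tokens(sentence: str, lexicon: set) -> list:
    """Return the tokens of `sentence` that appear in the lexicon."""
    tokens = sentence.lower().split()
    return [t for t in tokens if t in lexicon]

hits = flag_toxic_tokens("This badword1 sentence is fine otherwise", toxic_lexicon)
print(hits)  # -> ['badword1']
```

Note that plain whitespace tokenization is itself an assumption here; languages such as Chinese would need a proper segmenter before lexicon lookup.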
|
yjernite/prof_report__22h-vintedois-diffusion-v0-1__multi__12 | ---
dataset_info:
features:
- name: cluster_id
dtype: int64
- name: cluster_size
dtype: int64
- name: img_ids
sequence: int64
- name: img_cluster_scores
sequence: float64
splits:
- name: accountant
num_bytes: 1648
num_examples: 2
- name: aerospace_engineer
num_bytes: 1744
num_examples: 6
- name: aide
num_bytes: 1768
num_examples: 7
- name: air_conditioning_installer
num_bytes: 1648
num_examples: 2
- name: architect
num_bytes: 1720
num_examples: 5
- name: artist
num_bytes: 1816
num_examples: 9
- name: author
num_bytes: 1696
num_examples: 4
- name: baker
num_bytes: 1696
num_examples: 4
- name: bartender
num_bytes: 1648
num_examples: 2
- name: bus_driver
num_bytes: 1792
num_examples: 8
- name: butcher
num_bytes: 1696
num_examples: 4
- name: career_counselor
num_bytes: 1696
num_examples: 4
- name: carpenter
num_bytes: 1648
num_examples: 2
- name: carpet_installer
num_bytes: 1696
num_examples: 4
- name: cashier
num_bytes: 1720
num_examples: 5
- name: ceo
num_bytes: 1696
num_examples: 4
- name: childcare_worker
num_bytes: 1696
num_examples: 4
- name: civil_engineer
num_bytes: 1744
num_examples: 6
- name: claims_appraiser
num_bytes: 1744
num_examples: 6
- name: cleaner
num_bytes: 1744
num_examples: 6
- name: clergy
num_bytes: 1696
num_examples: 4
- name: clerk
num_bytes: 1792
num_examples: 8
- name: coach
num_bytes: 1696
num_examples: 4
- name: community_manager
num_bytes: 1720
num_examples: 5
- name: compliance_officer
num_bytes: 1672
num_examples: 3
- name: computer_programmer
num_bytes: 1648
num_examples: 2
- name: computer_support_specialist
num_bytes: 1720
num_examples: 5
- name: computer_systems_analyst
num_bytes: 1744
num_examples: 6
- name: construction_worker
num_bytes: 1696
num_examples: 4
- name: cook
num_bytes: 1696
num_examples: 4
- name: correctional_officer
num_bytes: 1720
num_examples: 5
- name: courier
num_bytes: 1768
num_examples: 7
- name: credit_counselor
num_bytes: 1672
num_examples: 3
- name: customer_service_representative
num_bytes: 1696
num_examples: 4
- name: data_entry_keyer
num_bytes: 1744
num_examples: 6
- name: dental_assistant
num_bytes: 1648
num_examples: 2
- name: dental_hygienist
num_bytes: 1624
num_examples: 1
- name: dentist
num_bytes: 1648
num_examples: 2
- name: designer
num_bytes: 1720
num_examples: 5
- name: detective
num_bytes: 1696
num_examples: 4
- name: director
num_bytes: 1720
num_examples: 5
- name: dishwasher
num_bytes: 1720
num_examples: 5
- name: dispatcher
num_bytes: 1672
num_examples: 3
- name: doctor
num_bytes: 1672
num_examples: 3
- name: drywall_installer
num_bytes: 1672
num_examples: 3
- name: electrical_engineer
num_bytes: 1744
num_examples: 6
- name: electrician
num_bytes: 1672
num_examples: 3
- name: engineer
num_bytes: 1696
num_examples: 4
- name: event_planner
num_bytes: 1696
num_examples: 4
- name: executive_assistant
num_bytes: 1696
num_examples: 4
download_size: 86326
dataset_size: 85232
---
# Dataset Card for "prof_report__22h-vintedois-diffusion-v0-1__multi__12"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
camenduru/microsoft-XPretrain | ---
dataset_info:
features:
- name: video_id
dtype: string
- name: url
dtype: string
- name: clip
list:
- name: clip_id
dtype: string
- name: span
sequence: string
splits:
- name: train
num_bytes: 6163343290
num_examples: 3281091
download_size: 1757807231
dataset_size: 6163343290
---
# Dataset Card for "microsoft-XPretrain"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sanps/GutenbergFictionSummary | ---
dataset_info:
features:
- name: file_id
dtype: string
- name: text_sub_id
dtype: int64
- name: text
dtype: string
- name: tokens
dtype: int64
- name: generated_text
dtype: string
splits:
- name: train
num_bytes: 1845974229
num_examples: 393386
download_size: 1156726889
dataset_size: 1845974229
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
language:
- en
pretty_name: Gutenberg Fiction Books + Summaries
---
Text from English books on gutenberg.org tagged as fiction and with at least 25 downloads, split into paragraphs.
Original dataset: sanps/GutenbergFiction
Summarized with cognitivecomputations/dolphin-2.6-mistral-7b.
For license details, see: https://www.gutenberg.org/policy/permission.html |
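The paragraph splitting described above can be sketched roughly as follows. This is a simplified assumption (blank-line splitting and a whitespace token count), not the actual preprocessing pipeline, which is not documented here; only the field names mirror this dataset's schema.

```python
# Rough sketch of splitting a book's text into paragraph records,
# mirroring this dataset's fields (file_id, text_sub_id, text, tokens).
# The whitespace-based token count is an assumption, not the real tokenizer.
def split_into_paragraphs(file_id: str, raw_text: str) -> list:
    records = []
    paragraphs = (p for p in raw_text.split("\n\n") if p.strip())
    for i, para in enumerate(paragraphs):
        para = para.strip()
        records.append({
            "file_id": file_id,
            "text_sub_id": i,
            "text": para,
            "tokens": len(para.split()),  # crude whitespace token count
        })
    return records

book = "First paragraph here.\n\nSecond one,\nspanning two lines.\n\n"
rows = split_into_paragraphs("pg100", book)
print(len(rows))  # 2
```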
carblacac/twitter-sentiment-analysis | ---
pretty_name: "TSATC: Twitter Sentiment Analysis Training Corpus"
annotations_creators:
- expert-generated
language_creators:
- other
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- feeling-classification
paperswithcode_id: other
configs:
- None
---
# Dataset Card for TSATC: Twitter Sentiment Analysis Training Corpus
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [TSATC](https://github.com/cblancac/SentimentAnalysisBert/blob/main/data)
- **Repository:** [TSATC](https://github.com/cblancac/SentimentAnalysisBert/blob/main/data)
- **Paper:** [TSATC: Twitter Sentiment Analysis Training Corpus](http://thinknook.com/twitter-sentiment-analysis-training-corpus-dataset-2012-09-22/)
- **Point of Contact:** [Carlos Blanco](mailto:carblacac7@gmail.com)
### Dataset Summary
TSATC: Twitter Sentiment Analysis Training Corpus
The original Twitter Sentiment Analysis Dataset contains 1,578,627 classified tweets; each row is labeled 1 for positive sentiment or 0 for negative sentiment. It can be downloaded from http://thinknook.com/wp-content/uploads/2012/09/Sentiment-Analysis-Dataset.zip.
The dataset is based on data from the following two sources:
University of Michigan Sentiment Analysis competition on Kaggle
Twitter Sentiment Corpus by Niek Sanders
This dataset was created by randomly selecting a subset of the original tweets, applying a cleaning process, and dividing them between test and train subsets, keeping a balance between the number of positive and negative tweets within each subset. These two files can be found at https://github.com/cblancac/SentimentAnalysisBert/blob/main/data.
Finally, the train subset was split into two smaller subsets: train (80%) and validation (20%). The final dataset consists of these two new subsets plus the original test subset.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The text in the dataset is in English.
## Dataset Structure
### Data Instances
Below are two examples from the dataset:
| | Text | Feeling |
| :-- | :---------------------------- | :------ |
| (1) | blaaah. I don't feel good aagain. | 0 |
| (2) | My birthday is coming June 3. | 1 |
### Data Fields
In the final dataset, all files are in JSON format with two columns:
| Column Name | Data |
| :------------ | :-------------------------- |
| text | A sentence (or tweet) |
| feeling | The feeling of the sentence |
Each feeling has two possible values: `0` indicates the sentence has a negative sentiment, while `1` indicates a positive feeling.
### Data Splits
The number of examples and the sentiment proportions are shown below:
| Data             |   Train | Validation |   Test |
| :--------------- | ------: | ---------: | -----: |
| Size             | 119,988 |     29,997 | 61,998 |
| Labeled positive |  60,019 |     14,947 | 31,029 |
| Labeled negative |  59,969 |     15,050 | 30,969 |
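Given records with the two fields above, the label balance of a subset can be checked with a few lines of Python. The rows below are illustrative (the first two come from the examples in this card, the third is hypothetical), not actual corpus files.

```python
# Sketch: computing the positive/negative balance of a labeled subset.
# The records mirror this card's two fields ("text", "feeling").
from collections import Counter

sample = [
    {"text": "blaaah. I don't feel good aagain.", "feeling": 0},
    {"text": "My birthday is coming June 3.", "feeling": 1},
    {"text": "what a great day", "feeling": 1},  # hypothetical row
]

counts = Counter(row["feeling"] for row in sample)
total = sum(counts.values())
balance = {label: n / total for label, n in counts.items()}
print(counts)  # Counter({1: 2, 0: 1})
```

Run over the real train split, the proportions should come out close to 50/50, matching the table above.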
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
Mentioned above.
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Citation Information
```
@InProceedings{naji2012tsatc,
  title = {{TSATC: Twitter Sentiment Analysis Training Corpus}},
  author = {Ibrahim Naji},
  booktitle = {thinknook},
  year = {2012}
}
```
### Contributions
Thanks to [@carblacac](https://github.com/cblancac/) for adding this dataset, transformed from the original corpus. |
CyberHarem/lematin_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lematin/ルミタン (Pokémon)
This is the dataset of lematin/ルミタン (Pokémon), containing 28 images and their tags.
The core tags of this character are `breasts, green_hair, hat, green_eyes, drill_hair, large_breasts, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 28 | 16.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lematin_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 28 | 13.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lematin_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 54 | 22.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lematin_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 28 | 16.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lematin_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 54 | 27.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lematin_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lematin_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, bare_shoulders, elbow_gloves, cleavage, solo, dress, huge_breasts, smile, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | elbow_gloves | cleavage | solo | dress | huge_breasts | smile | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:-----------|:-------|:--------|:---------------|:--------|:--------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
|
naem1023/final_aug_2000 | ---
license: afl-3.0
---
|